Apple Silicon M1: A Developer's Perspective (steipete.com)
X-Istence 1234 days ago [-]
I'm a developer as well, but I don't use apps like VSCode or Java/Gradle... instead I develop open source projects in Python and Rust.

1. For my open source projects, running tests has sped up so much that I find it more of a joy to work on these projects now. Test suites that take ~60 seconds on my 2017 MacBook Pro i7, now take just ~15 seconds. That's a HUGE win in the edit, run test suite cycle.

2. My Rust projects now compile in WAY less time. Something that used to take 3 minutes to compile + link for a debug build is now down to 40 seconds. Once again, this has increased my productivity in the edit, compile, test cycle.

For other projects like node/npm the speedups are also massive; running the various pipelines that go from source code to final deployable artifact is much faster.

I use vim to develop, and it runs on Rosetta just fine, so I can continue to use all my existing plugins. I don't extensively use docker containers, but for those I connect to the cloud instead of running them locally, which is a move I was already making even on my Intel based MacBook Pro.

I got the 16GB 13" MacBook Pro. I, like the blog author, was thinking of this as a secondary laptop, something fun to have around, but I have hardly touched my old 15" MacBook Pro because this one does everything I already need. And does it faster.

I've been living with 16GB of RAM forever, so maybe I am used to fitting in that memory space... I avoid Electron based apps like the plague, and with being able to run iPad/iOS apps on the M1, I've been able to cut down even further since things like Authy are way faster to launch when they aren't large Chrome based hogs.

I am looking forward to the 16" MacBook Pros with Apple Silicon-based processors, but only because I do miss the additional screen real estate and prefer the larger size. But the 13" MacBook Pro M1 has been absolutely fantastic so far.

spideymans 1234 days ago [-]
>I've been living with 16GB of RAM forever, so maybe I am used to fitting in that memory space... I avoid Electron based apps like the plague, and with being able to run iPad/iOS apps on the M1, I've been able to cut down even further since things like Authy are way faster to launch when they aren't large Chrome based hogs.

iOS apps on the Mac are a game changer for me. The UX isn't 100% perfect, but it sure is better than web apps, which are very often slow, resource intensive and kill your battery. I haven't touched the Twitter web app since I've installed the iOS/Catalyst app on my Mac. Even with its iOS-optimized user interface, the better performance of the iOS app makes it so much more enjoyable than the web version.

I'm very much looking forward to seeing other iOS apps arrive on the Mac, to replace their web variants.

There's really something to be said about how web developers have been killing the performance of their websites with their bloated frameworks or whatever. I know web apps will always be at a performance disadvantage compared to native apps, but the performance regression has gotten absurd. Using Reddit via the Apollo iOS app is a quick and buttery smooth experience, meanwhile the redesigned Reddit webpage is a laggy stuttering mess on my MacBook Pro.

kitsunesoba 1234 days ago [-]
> iOS apps on the Mac are a game changer for me. The UX isn't 100% perfect, but it sure is better than web apps, which are very often slow, resource intensive and kill your battery. I haven't touched the Twitter web app since I've installed the iOS/Catalyst app on my Mac. Even with its iOS-optimized user interface, the better performance of the iOS app makes it so much more enjoyable than the web version.

Totally agree, particularly for small utility apps and the like. Most good iOS apps launch instantly and sip power even on comparatively limited iPhone and iPad batteries, so the impact they have on a MacBook battery should be close to nothing. Night and day compared to Electron stuff.

lstamour 1234 days ago [-]
Similarly, it's a joy to use the Jira app for macOS, which uses Catalyst to port their iPad app to macOS. Compared to how long it takes to load and use Jira on the web, it's just a bit snappier and nicer. The dark mode support is also appreciated, though the aesthetics could still improve to be a bit more Mac-native... and because of Jira's design, it's possible the app will always lag in functionality compared to the website. But still, progress! :)
nicoburns 1234 days ago [-]
Are you serious? The JIRA app is by far the worst native Mac app I've ever used. If their website is anything to go by then it's not the underlying technology's fault, but I found the JIRA Mac app to be just as slow as the website.
lstamour 1234 days ago [-]
I'm serious, yes, it's better. I'm not going to say it's as good as other apps. I mean, if you're keeping a to-do list, better to do that in Apple Reminders -- especially now that they've added that new shared list feature and assigning todos. But if you have to put up with Jira, then yes, I personally find the Jira app to be faster than the website. It reminds me why I like apps as much as (or more than) websites.

I've also run Jira on a server before, and if you optimize the Java runtime's settings and over-provision, you can get a pretty fast Jira instance going. It's just most folks rarely touch the default configurations, which are terrible, or don't shell out for SSDs when running it. And the cloud version ... well, it seems a bit under-provisioned if you ask me. Jira doesn't appear to be written well enough to take full advantage of the cloud, yet.

girvo 1234 days ago [-]
I wonder how it compares to the existing Mac native Jira app?
lstamour 1234 days ago [-]
The Jira Cloud iPad app does not appear in the App Store. Apps can opt out from the macOS App Store, and this is especially useful if there already is a macOS version. Why? Generally, any macOS version of an app -- even one like Jira which is nearly a straight port of the iOS app -- is 10x better than the iOS version. Running iOS apps on the M1 is a lot rougher around the edges compared to running Intel apps on the M1. I expect Apple will polish up how iOS apps run on macOS in future releases, if they maintain the feature in its current form.
someperson 1234 days ago [-]
I think a lot of the performance gain is due to switching from an aging 2017 Intel processor to a modern processor on a recent TSMC node (5nm in Apple's case).

Had Apple switched to AMD Ryzen processors on TSMC's 7nm node, the performance improvement would have been similarly stark.

dreamcompiler 1234 days ago [-]
I think the most important performance benefit of the M1 is having all RAM within a few millimeters of the CPU. Can you do that with a Ryzen?
Rillen 1233 days ago [-]
Huh? Is there anything to support your assumption?
dreamcompiler 1233 days ago [-]
The speed of light is a huge limiting factor on modern processors.

https://arstechnica.com/science/2014/08/are-processors-pushi...

Light travels about a foot in 1 nanosecond so moving RAM from a foot away to touching the CPU saves 2 ns in round-trip latency. That's a big win on modern systems.

In reality it's a bigger effect since electric signals travel slower than light. And the effect is yet bigger because off-package memory must be accessed through buses with several levels of synchronization and buffer and gate delays.

Apple gets away with "unified memory" in the M1 because having the memory on-package means a good deal of the bus and sync and contention logic becomes unnecessary. So everything that touches RAM gets a lot faster. And almost everything touches RAM.

https://www.theregister.com/2020/11/19/apple_m1_high_bandwid...

https://www.macworld.com/article/3597569/m1-macs-memory-isnt...
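
As a rough back-of-the-envelope (my own numbers, assuming signals in copper traces propagate at about half the speed of light, a common rule of thumb):

    # Round-trip signal delay vs. trace length.
    # Assumes ~0.5c propagation speed in a PCB trace (a rule of
    # thumb, not a measured value for any specific board).
    C = 299_792_458  # speed of light, m/s

    def round_trip_ns(distance_m, speed=0.5 * C):
        return 2 * distance_m / speed * 1e9

    print(round_trip_ns(0.30))   # DIMM ~30 cm away: ~4.0 ns round trip
    print(round_trip_ns(0.005))  # on-package RAM ~5 mm away: ~0.07 ns

And that is before the bus, synchronization, and gate delays mentioned above.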

wulfklaue 1233 days ago [-]
Plenty more advantages than just round-trip latency. The fact that it's a single SoC in turn reduces traces on the motherboard, freeing up real estate.

We are finally moving to the convergence point between smartphones and PCs. It has been in the making for a while.

Now, whether that is a good thing for consumers (losing upgradability) is another question. But PCs have become less and less flexible anyway.

We used to have slots for everything (GPU, network, audio, ...). Now we have ATX boards where at most you put in a GPU. SATA is going the way of the dodo, with M.2 becoming the new standard for anything data. I mean, what do we even change anymore on PCs... CPU, memory, GPU.

GPUs will become external devices with USB4. Mass storage has been moving more and more to NAS or USB3 devices. The people who need 4+ HDDs are the exception, and you can get away with USB3 hubs + external enclosures.

I have questioned for a long time why we still have chipsets that artificially segment motherboards when the differences between them have become very small anyway. You can easily move that last bit of I/O into the CPU SoC.

The day we buy AMD or Intel SoCs with some default CPU+RAM+IO is probably closer than most think.

Separate hardware is probably going to become a server/workstation/Pro-only feature (with big $$$ prices).

The reality is that hardware has reached a point where most people haven't upgraded in years. It's a smaller group/minority that really needs ultra-fast hardware.

Flexibility is moving from big motherboards to external devices, connected over high-speed links.

Sorry if I have gone a bit off topic, but when you mentioned the onboard memory, it got me thinking about how we really are moving to a SoC/NUC/... future even for powerful hardware.

read_if_gay_ 1234 days ago [-]
But not the battery life.
nightski 1234 days ago [-]
That's not entirely true. My Ryzen 4900HS laptop gets great battery life with integrated graphics. Not as good as the M1, but it's a Zen 2 chip and we have yet to see what Zen 3 mobile will do.

Apple clearly has a huge advantage here with a cutting edge process and, for all practical purposes, unlimited money to throw at this. Their cash on hand is more than AMD's full market cap.

threeseed 1234 days ago [-]
Can you clarify what you mean by "great", i.e. in hours?

From all benchmarks the Apple M1 laptops get around 1.8-2x the battery life of anything from AMD/Intel.

neogodless 1234 days ago [-]
My friend sent me a screen shot from `powercfg /batteryreport` on his Asus Zephyrus G14 (Ryzen 9 4900HS) and it had consumed 16,704 Wh of battery in 1:47:27 of active use. That tracks to about 8.5 hours (capacity showed just under 80 Wh).
X-Istence 1234 days ago [-]
I've been averaging about ~15 hours per charge on my 13" MacBook Pro M1 so far.

That includes compiling and running tests, not just web browsing and checking mail.

dr_zoidberg 1233 days ago [-]
Going by TSMC's claim of 30% better power consumption at iso-speed, the G14's battery life would be at 11-12 hours. So that places the Apple system in a 25-36% better battery range, instead of 80-100%.

And still, there are more catches: Apple is probably driving the screen at 60Hz instead of 120Hz for the G14, and the screen tends to be one of the main drivers of battery life.

What I'm trying to say is that the talks revolve around the M1 vs x86 for battery consumption, but the big savings can probably come from other components in the system.
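
As a sketch of that math (the ~8.5 h and ~15 h figures are the ones reported in this thread; the 30% is TSMC's own marketing number):

    # Normalize the 7nm G14's battery life to a hypothetical 5nm part.
    g14_hours = 8.5        # reported above for the G14
    m1_hours = 15.0        # reported above for the M1
    iso_power_gain = 0.30  # TSMC's claimed power reduction, 7nm -> 5nm

    g14_on_5nm = g14_hours / (1 - iso_power_gain)
    print(g14_on_5nm)                 # ~12.1 h
    print(m1_hours / g14_hours - 1)   # ~0.76, i.e. the ~80% headline gap
    print(m1_hours / g14_on_5nm - 1)  # ~0.24; taking 11-12 h gives the 25-36% range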

Rillen 1233 days ago [-]
It's such a hard comparison though.

You are running the newest of the newest.

My Lenovo ThinkPad X390, which I've been on for a while right now, still has ~8h left to go.

Chrome with 30 tabs, IntelliJ, etc. open and running.

I do assume the M1 is doing a great job, don't get me wrong, but I would like to see a more objective comparison.

Personally speaking, the most interesting thing for me is that Apple now puts a very strong SoC in all its models, from Air to Pro, at a relatively good price point (as long as you don't upgrade anything...)

meditative 1234 days ago [-]
mWh? Otherwise that's one hell of a battery.
neogodless 1234 days ago [-]
Oops I missed my edit window! That'd also be some insane power consumption! Let's pretend I'm not from the U.S.A. and the comma is used as a decimal.
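
With the corrected unit the arithmetic checks out, as a quick sketch:

    # Projecting runtime from the powercfg numbers above,
    # reading 16,704 as mWh rather than Wh.
    consumed_wh = 16_704 / 1000           # ~16.7 Wh used
    elapsed_h = 1 + 47 / 60 + 27 / 3600   # 1:47:27 -> ~1.79 h
    capacity_wh = 80                      # "just under 80 Wh"

    avg_draw_w = consumed_wh / elapsed_h  # ~9.3 W average draw
    print(capacity_wh / avg_draw_w)       # ~8.6 h, i.e. "about 8.5 hours"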
totalZero 1234 days ago [-]
Power efficiency is indeed one of the benefits of smaller process nodes, so it has a direct impact on battery life.
tbrock 1234 days ago [-]
Or the coolness (lack of heat).

I bought a laptop with an AMD chip once, never again. Thing was a fire hazard.

adar 1234 days ago [-]
Gotta be more specific when you say AMD chip. Was it a newer Zen chip or one of the old ones pre-Ryzen?
chii 1234 days ago [-]
> Test suites that take ~60 seconds on my 2017 MacBook Pro i7, now take just ~15 seconds.

wow, a 4x improvement without you doing anything but changing hardware!? That's an excellent upgrade imho.

I wish Apple would sell the M1 as a generic processor that other machines could use.

atq2119 1234 days ago [-]
Oh they're able to sell the M1 however they want. They just don't want to.
X-Istence 1234 days ago [-]
Yeah, it is absolutely insane for my workloads.
runeks 1234 days ago [-]
> Test suites that take ~60 seconds on my 2017 MacBook Pro i7, now take just ~15 seconds.

Interesting! Are you compiling to arm64 or emulating x86?

X-Istence 1234 days ago [-]
This is Python 3.6 -> 3.9 compiled for Intel, running under Rosetta 2.

For Rust I am running the ARM toolchain that was recently released, so no Rosetta involved at all.

dr_zoidberg 1233 days ago [-]
Do you think moving from Python 3.6 to 3.9 had any impact whatsoever? I'm surprised by such a large improvement in performance, as other reports I read indicated similar-to-slightly-worse Python performance on M1 vs x86.
X-Istence 1233 days ago [-]
I run the test suite across those Pythons. So Python 3.6, 3.7, 3.8, and 3.9, using tox to drive the automation and test against all of those versions to validate compatibility and that nothing breaks (roughly the loop sketched below).

So there is no moving anything. Each Python is compiled using pyenv for the Intel platform.
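
For anyone unfamiliar with tox, what it automates here is roughly this loop (a sketch only; the pyenv-style interpreter paths are illustrative, not an actual configuration):

    import os
    import subprocess

    # Roughly what tox automates: run the same test suite under each
    # interpreter to validate compatibility across versions.
    # The pyenv-style paths are illustrative, not a real config.
    for version in ["3.6", "3.7", "3.8", "3.9"]:
        python = os.path.expanduser(f"~/.pyenv/versions/{version}/bin/python")
        subprocess.run([python, "-m", "pytest"], check=True)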

Which reports are you reading? Are they using C libraries or doing heavy math computation that is not CPU bound because there's no optimized ARM version yet?

dr_zoidberg 1233 days ago [-]
Okay, I'm not finding any benchmark that supports the idea of a slowdown, so probably I remembered some rumour I read on twitter or reddit.

What I am finding now is that python is working more than fine on M1, be it through rosetta or native code, which is consistent with your experience.

Also to clear things a bit, I parsed your "3.6 -> 3.9" as meaning that you had moved from Py36 on intel to Py39 on apple, which is why I asked if maybe there were speedups or differences from moving to 3.9.

X-Istence 1233 days ago [-]
Ah, my apologies for not making that more clear. Currently all are running under Rosetta.

I was unable to get all the versions I needed compiled natively, which means for now I will use the non-native versions.

I have tested Py3.9 compiled for arm64, and one particular test suite went from 9 seconds to 6 seconds. Other test suites were not as dramatic, especially larger ones, where it might go down from 40 seconds to 36 seconds.

Still a speed improvement, but not as dramatic as the 2017 MacBook Pro i7 -> 2020 MacBook Pro M1.

kossTKR 1234 days ago [-]
When do people think the 16" with an M chip will be available?

I'm honestly for the first time in a decade a bit excited about an upcoming computer, pretty great.

Still running on a 2014 MBP that honestly is still great, so I can wait a bit.

wulfklaue 1233 days ago [-]
> I'm honestly for the first time in a decade a bit excited about an upcoming computer,

I think we are more excited about a shake-up in the whole computer world, which may mean the death of ATX/motherboards/etc. as we know them, and a more profound move to eGPU/NUC/laptop-like powerful devices with low power usage.

This feels a bit like a wake-up call to the industry. Like how the first commercially successful smartphone, the original iPhone, fueled an entire industry to massively evolve for over 10 years. Which is kind of funny, because I felt like smartphone evolution was starting to lose steam. The convergence between PCs/laptops/smartphones will begin now.

thejohnconway 1233 days ago [-]
I don't think this transition will take two years; the transition to Intel took less time than they said last time. Maybe for the Mac Pro.

The MacRumors buyer's guide has the MacBook Pro 16" as due an update right now: https://buyersguide.macrumors.com/#MacBook_Pro_16

Honestly, I’d be surprised if we don’t get an M1x (or whatever it’s called) within six months.

pfranz 1234 days ago [-]
Nobody actually knows, but I remember Apple saying something like a 2 year transition (which includes new Intel products). So if you're not in a rush that's likely the longest you'll wait.
alwillis 1233 days ago [-]
> I use vim to develop, and it runs on Rosetta just fine, so I can continue to use all my existing plugins.

Don't forget—Vim comes as a Universal binary, so it already runs natively on the M1.

    /usr/bin/vim: Mach-O universal binary with 2 architectures
    [x86_64:Mach-O 64-bit executable x86_64] [arm64e]
    /usr/bin/vim (for architecture x86_64): Mach-O 64-bit executable x86_64
    /usr/bin/vim (for architecture arm64e): Mach-O 64-bit executable arm64e
raverbashing 1234 days ago [-]
Good to know

I'm glad Vim is running fine on Rosetta; still, if you wanted to compile it from source, it really shouldn't be too hard.

(Rust is a harder problem, so I wouldn't worry too much; still, once the dependencies are downloaded and compiled it's usually fine.)

saagarjha 1234 days ago [-]
No need to use Rosetta:

  $ file `which vim`
  /usr/bin/vim: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64e]
  /usr/bin/vim (for architecture x86_64): Mach-O 64-bit executable x86_64
  /usr/bin/vim (for architecture arm64e): Mach-O 64-bit executable arm64e
lmedinas 1234 days ago [-]
vim 8.2 included in macOS is native; neovim is not yet ported :(
mlindner 1234 days ago [-]
NeoVIM works fine after adding a compilation target. See: https://github.com/neovim/neovim/pull/12624

It hasn't been merged yet as some of the dependencies aren't available, one of them apparently being LuaJIT, but if you build it yourself without that it shouldn't be a problem.

X-Istence 1234 days ago [-]
I install from Homebrew, specifically MacVim. The system version is usually behind on patch levels/plugins that can be loaded.
mlindner 1234 days ago [-]
Gluing yourself at the hip to Homebrew is a bad idea IMO. It's trivial to install things yourself. It's just a matter of `./configure && make && make install` and you're done.
ibraheemdev 1234 days ago [-]
Why do a `git clone pkg && cd pkg && ./configure && make && make install` instead of just a `brew install`? The whole point of a package manager is to make installations and upgrades easier.
mlindner 1234 days ago [-]
Homebrew never was a package manager until recently. It was in the name: it was designed for building packages from source on your local machine.
X-Istence 1234 days ago [-]
Now you are just quibbling over what is ultimately a minor difference.
X-Istence 1234 days ago [-]
"just"

You also need to track down and install all of the dependencies, and the dependencies' dependencies all the way down, and figure out the right configure flags to pass, what environment variables it needs, and where it expects to look for header files and the like.

Doing all of that from scratch takes hours and hours. Homebrew lets me not worry about that, and instead be productive.

ant6n 1233 days ago [-]
It also takes care of updates
arcticbull 1234 days ago [-]
I'm doing my Rust development in a Docker container deployed on AWS w/Fargate and EFS to store my source, via VSCode Remote on my M1 MacBook Pro :) Thin client, arbitrarily fat server. VSCode Remote makes it a very pleasant experience. I'll document my experience once I've smoothed out the kinks.

The nice thing is I can just wait out the compatibility issues re: arm migration, especially with some of the third party crates my rust projects depend on currently failing to build.

aledalgrande 1234 days ago [-]
What is the monthly expense to run your dev on Fargate?
arcticbull 1233 days ago [-]
Just got it set up a few days ago, but happy to report back. I'll also write up a quick medium post or something.
X-Istence 1234 days ago [-]
Being able to easily install it using Homebrew is really where that need came from. I can use the existing bottles, and the recommendation right now is to not mix and match Intel/ARM binaries.

Also, the ARM version of Homebrew is not yet ready for primetime; there is a ton of software that doesn't compile for ARM yet (including MacVim).

kzrdude 1234 days ago [-]
How is Rust support for the platform?
X-Istence 1234 days ago [-]
It's Tier 2... and I've been using the ARM toolchain that was just released.

So far all the crates that were not working have been getting updates to work on the new platform, and I haven't found any issues!

throwaway201103 1234 days ago [-]
> Something that used to take 3 minutes to compile + link for a debug build is now down to 40 seconds.

1990s me laughs remembering edit/compile/test cycles measured in hours.

zeroonetwothree 1234 days ago [-]
Well it still can be hours. Really depends what you work on.
Aeolun 1234 days ago [-]
At that point, wouldn’t you buy a double Threadripper machine to speed it up? If you sit around for a few hours that is a lot of time wasted.
harrygeez 1234 days ago [-]
Well it still could be, depending on what you do
selestify 1234 days ago [-]
What were you compiling back then?
in3d 1234 days ago [-]
Why are you using a laptop for development in the first place unless you have to be on the move? Why not take advantage of even faster CPUs or GPUs, more RAM, many times more screen space, more disk space, an ergonomic keyboard and mouse, or even real speakers if you work from home?
samb1729 1234 days ago [-]
In a similar vein: "Why are you using a Mac for development in the first place unless you have to develop software for Apple devices? Why not run Linux if everything you write runs on Linux servers?"

Your post reads less like a sincere question and more like "you should be doing it this way unless you can defend your position". Maybe--just maybe--people weigh up tradeoffs differently than you do personally.

FWIW 'on the move' can mean going into a meeting room or even just walking over to a coworker's desk with your computer. Desktops kinda suck when you work around humans who don't just stare at their screens with cans on all day.

coward8675309 1234 days ago [-]
You've touched on something that irks me to no end — the tendency of many Elite Coder Bros to say things like, "Didn't you see my enormous headphones on?! Can't you see I'm coding — don't you realize I think for a living?!"

Yeah, dude, we're all thinking. As a hands-on programmer-manager I can report that creating slides for the fundraising deck that'll raise money to pay you or writing that email about you taking bags of beef jerky home for your post-workout recovery meal can take as much sustained concentration as you spinning in your `while 1 { copy_compiler_error(); stack_overflow_it(); paste_code(); }` infinite loop.

spacedcowboy 1234 days ago [-]
I tend to get given large projects to do. I'm fairly senior, and once we've got past the brainstorming stage, the ideas have flowed forth, and the direction is, if not clear, then at least aspirational, I often get given the job of breaking ground.

Here's how I think.

I model interaction between distinct parts. I have a mental model of how X fits with Y, how X can affect Y, and how Y can in turn affect X. I'm not doing it with just X and Y, I painstakingly build this mental model[1] over as much of the problem space as I can, and having done this for many years now, I can cover a fair amount of ground before the complexity gets beyond my ability to model. It takes a while to create this, and then when some middle-management type wanders along, taps me on the shoulder and disturbs the concentration, and it all comes crashing down around me, I am less than best-pleased. Bonus points if it's just to "touch base" or "remember we have a meeting in 2 hours time", or ... you get the picture.

Why do I expend this enormous mental effort to gain such a fragile and ephemeral state ? Because I can mentally throw boundary conditions at it and "see" how things will react. It's how I deal with inherent complexity of large systems, and a couple of hours of mental effort can prevent me spending a week coding down a dead-end path. It's happened often enough now that even my line management understand it's worth the time - there's been plenty-a-meeting where I go in and say "yeah, I know we wanted to do <insert X> but I think there's a problem when Y and Z come into play under conditions A, B and C. I think <insert option gamma> is a better route even though we didn't think so at first".

Sometimes you really do just need to be able to be left alone and think. As someone who used to own the company before he sold it, and who's done pitches to VC's and other investors, I can quite categorically state that (for me), the slides, presentations, and client management is nowhere near the level of mental investment. Nowhere near.

Just my $0.02

---

[1] It's not visual, I have aphantasia, it's more firing-condition-based.

waheoo 1234 days ago [-]
My wife and I discuss this all the time. Not all jobs require the same level of thought. Her work involves a lot of mechanical movement, practice, skill, talent, and some thinking. But most of her day-to-day can be done while listening to a podcast, like she's driving.

I can't do that.

So no,

We're not "all thinking", some tasks require deep thought and long periods of uninterrupted concentration.

The problem is that working in a team requires communication and management of interruptions. It is not useful to be the dick with the headphones on; that creates barriers (the attitude, not the headphones) to communication.

in3d 1234 days ago [-]
I do think it makes little sense, and it's a message board; sometimes things will be said directly. If a preference is not rational and has negative consequences then it's worth challenging. Who knows, maybe somebody will realize that squinting at the text on a 13-inch screen while looking down and having their hands in unnatural positions for many hours each day is not great for them. I never said that you can't have a laptop to take with you. I'm talking about development like the OP was - where things like compile times matter and where you can very likely remotely access your desktop if you want to show things to others.
mypalmike 1234 days ago [-]
Much of the development world has moved to laptops, docks for ergonomic desktop use, and cloud compute for when beefier hardware is needed. As a developer, I haven't used a desktop machine either at work or at home in well over a decade.

This is all off topic in any event.

in3d 1234 days ago [-]
Sorry to hear that. I tried to use a top-of-the-line 2019 MacBook Pro for some development while on the road and it doesn’t come close to a desktop, even when it’s augmented with external monitors, etc.

“Compared to the desktop (sitting), tablet and laptop use resulted in increased neck flexion (mean difference tablet 16.92°, 95% CI 12.79-21.04; laptop 10.92, 7.86-13.97, P < 0.001) and shoulder elevation (right; tablet 10.29, 5.27-15.11; laptop 7.36, 3.72-11.01, P < 0.001).”

“These findings suggest that using a tablet or laptop may increase neck flexion, potentially increasing posture strain.”

https://pubmed.ncbi.nlm.nih.gov/30373975/

And it is on-topic enough for both the link and the OP.

mypalmike 1233 days ago [-]
You seemingly don't know what a docking station actually is.
woah 1234 days ago [-]
Ever want to code on the couch? Not everyone wants to spend all day sitting on a gaming chair in the man cave illuminated by the light of their LED desktop enclosure
in3d 1234 days ago [-]
See my other response. Remote login.
joshmanders 1234 days ago [-]
Literally everyone who chooses a laptop as their main driver is doing it purely for the portability... We don't choose it because it's small; we choose it because, while most of the time we're stationary, when we do need to be portable it's easier to just undock and walk away than to make sure your laptop and desktop are synced.
macintux 1234 days ago [-]
I was considering picking up a new Mini so I could drive two displays, but then I remembered another laptop advantage: TouchID. I would really miss that.
joshmanders 1234 days ago [-]
I drive two 27-inch 4K displays at 3840x2160 on my 2019 16-inch MacBook pro.
macintux 1234 days ago [-]
Ditto, but the M1 laptops can officially only drive one external display. People have hacked around it but I'd rather wait.
X-Istence 1234 days ago [-]
I have it driving a 49" UltraWide Curved Monitor... which replaced my two 27" monitors.

Might not be for everyone, but I only need it to drive a single external monitor.

waheoo 1234 days ago [-]
At what res/rate?
X-Istence 1234 days ago [-]
5120 by 1440 at 60 hertz.
threeseed 1234 days ago [-]
You can use DisplayLink to get around it.

I wouldn't class it as a hack since it is pretty standard in enterprise environments. Downside is that it does consume CPU which depending on your workload may/may not be noticeable.

joshmanders 1234 days ago [-]
Ah I didn't know that.
in3d 1234 days ago [-]
You can get up to three 4K monitors plus the original laptop screen running together with this MacBook Pro. But it becomes laggy.
st3fan 1233 days ago [-]
The Mini also unlocks with your Apple Watch :-)
macintux 1232 days ago [-]
Good point. I don’t think it handles all of my use cases (e.g., passwords and credit card info in Safari) but I’d forgotten that there are several instances where my finger and my watch are in competition to see who unlocks something first.
in3d 1234 days ago [-]
You can remotely access your desktop if you’re spending most of the time stationary.
joshmanders 1234 days ago [-]
Or just get a laptop...
samb1729 1234 days ago [-]
No, no, no. You need a massively over-specced desktop running at all times along with the burden of maintaining two devices instead of just one. How could you be so blind to this fact?!

...

pkage 1234 days ago [-]
Not the original commentor, but it's nice to be able to pick up my laptop from my desk and move to sit in the kitchen, or the couch, or outside, or (in better days) a café. I really only barely use my desktop nowadays, and half the time it's over mosh.
coin 1234 days ago [-]
Every tech company that I’ve worked for in the last 10 years issues laptops to their SW engineers
waheoo 1234 days ago [-]
I bet you'll find a lot of devs hate it.

I had a seriously nice Mac Mini taken off me once and got given a "new Mac Pro". It was a huge downgrade; I was not happy.

war1025 1234 days ago [-]
I have a ~10 year old desktop and a ~5 year old laptop (I think we're due for another upgrade cycle here soon).

We got docking stations for the laptops but I just couldn't get used to it, and my desktop was just sitting there still functioning happy as could be.

So I still use my desktop for actual work and my laptop is basically a very over-powered video conferencing machine.

throwaway45349 1234 days ago [-]
Don't most companies also provide workstations at desks for engineers? I know many who only use their laptop for meetings (until WFH) and it always seemed like a huge waste.
in3d 1234 days ago [-]
Right. I bet many of them also love open-floor plans despite research showing that they reduce the number of meaningful interactions by 70%.
X-Istence 1234 days ago [-]
Because I enjoy working from my home office hooked up to my 49" UltraWide, but I also enjoy working from a coffee shop, or from my couch, or from my bed, or when I go spend time with friends and do amateur photography I like having my laptop with me to do on the fly editing/reviewing of images before finishing a shoot...

Portability.

waheoo 1234 days ago [-]
I wish I could do this but my laptops would always struggle to run well driving dual 4k monitors.

I think next go-round I'm going to get a desktop-replacement style laptop and see if I can get the best of both worlds.

I don't really care about portability and thus use a desktop but I do want to sit outside or work from the couch sometimes.

charrondev 1233 days ago [-]
The MacBook Pros have been able to drive 2 4K displays at 60Hz since at least 2016 (when they became Thunderbolt 3).

I don’t recall if that worked on the smaller models, but it worked on my 15” model. I’ve been using 2 Dell P2715Qs since then and they work great.

Rillen 1233 days ago [-]
Working from home on a Friday is probably the biggest reason for a lot of people, I guess.

Meetings are another one.

Doing presentations.

Also, for me at least: on-call.

I would prefer a workstation at home and a lightweight laptop with company network access, but hey, my company prefers to buy me a laptop for 3k over a desktop for 1k and a light laptop for 1k.

What can I do? Yeah, nothing :)

sally1620 1233 days ago [-]
In most companies, a laptop is standard issue. In my company, you are given a laptop on day 1, but you have to put in a request for a desktop.

It also greatly simplifies working from home. Not just for portability.

hmottestad 1234 days ago [-]
Personally can’t wait to upgrade. The point about 16GB being on the low side was nice to get confirmed. There has been a lot of conspiracy on 16GB being plenty enough, but as a developer I know that to future proof my next machine I would really like 64GB. I already have 16GB of ram on my 8 year old MacBook Pro.
samcheng 1234 days ago [-]
I suspect there are a lot of us in that boat, with 5-to-8-year-old MacBook Pros. Apple really missed the mark with the intervening generation, removing useful features and replacing them with features unattractive to developers.

Maybe this big leap in processor / thermal / battery life (and a revision back to proper keyboards) will be enough to get us to switch! I think 64GB (or at least 32GB) would be a nice sweetener.

DrBazza 1234 days ago [-]
I'm still running a late 2013 MacBook, now on Big Sur. The battery life is now about 3-4 hours if I'm doing dev work, but it's still pretty good otherwise, for a 7 year old laptop. I had a Sony Vaio back in the early 00s, and it started out running XP 'ok', and then... Vista, which made it dog slow. My point being that I'm surprised that a 7 year old laptop is still "good".

A new laptop isn't critical to my work at the moment, so I'm happy to wait for an M2, if that's what it will be called. Looking at Apple timelines a new chip tends to be 1-2 years apart, and I'm fine with that.

I'm also reluctant to give up Magsafe, or at least have to buy a magnetic USB C connector.

Tagbert 1234 days ago [-]
With Apple moving forward rapidly on this architecture change, I expect them to upgrade the processors at a faster pace than normal. They are 6 months into a 2 year plan to migrate all of their systems. I would expect the next version of M chips in 6 to 12 months with higher RAM limits and faster GPU. That would be targeted at their other laptops and the iMac. It will likely take longer for them to get to something that can be used in the Mac Pro and maybe an iMac Pro where RAM and GPU needs are even greater.
michaelmrose 1234 days ago [-]
I wonder if they will simply provide a motherboard with more CPU sockets for an M1 v2 for higher end machines like the iMac/iMac Pro/Mac Pro.
glangdale 1234 days ago [-]
It seems unlikely. There's a lot of extra complexity with multi-socket and there's no actual reason for them to do this. I'm not a CPU designer but they might do the equivalent of what you said with a chiplet design like AMD (Intel also has announced plans to eventually get to chiplets).
potemkinhr 1234 days ago [-]
The ages-old AMD philosophy: shovel more cores at it? With such a low TDP it might work.
samcheng 1234 days ago [-]
I love Magsafe as well. However, if the new laptop battery truly lasts all day, then it won't be nearly as important. I'll just charge the laptop overnight, and treat it like my phone.

Maybe they will integrate wireless charging into the laptops soon?

billti 1234 days ago [-]
I would agree, with the exception that I’ve noticed really CPU heavy things like builds that take > 1 min still complete a lot faster when plugged in, even on a full battery. I assume they still need to throttle somewhat when only running on battery.
macintux 1234 days ago [-]
Air? Or do you have a fan?
trynewideas 1234 days ago [-]
It's not throttling, per se; the M1 prioritizes its efficiency cores when off mains power, and the performance cores when on mains power.

When unplugged, it engages the performance cores as little as possible. You should be able to run `powermetrics` on an interval while running a high-intensity task to confirm; the performance cores pin first when on mains, then ramp up the efficiency cores, and the opposite happens on battery.

It's a big difference because the efficiency cores max out at about 2W draw, and the performance cores max out at around 18W. There's a big incentive in battery life to rely on that behavior.

I'm not sure yet if it's tunable, macOS doesn't make that a transparent behavior where you can just tweak it at will.

The Air will throttle if under full load long enough, but it's still mostly just ramping down the performance cores to stay cool.
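
A minimal sketch of that check (powermetrics needs root, and sampler names may vary between macOS releases):

    import subprocess

    # Sample CPU power on an interval while a workload runs, to watch
    # whether the efficiency or performance cores ramp up first.
    subprocess.run([
        "sudo", "powermetrics",
        "-i", "1000",               # sample every 1000 ms
        "-n", "10",                 # take 10 samples, then exit
        "--samplers", "cpu_power",  # CPU power/frequency sampler
    ], check=True)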

Obi_Juan_Kenobi 1234 days ago [-]
Magsafe seems less an issue now.

With a true 'all day' battery, you'll probably only plug it in at a desk where tripping or knocking things over isn't an issue. While travelling, you just charge it at night or when you aren't using it. Much more like smartphone usage.

How many people are aching for a magsafe phone charger?

ldrndll 1234 days ago [-]
If it’s any consolation the battery life on my 6 month old 16” MBP is also 3-4 hours when I’m doing work. The CPU is an absolute dog, and the fans kick in under the faintest of loads.
desiarnezjr 1234 days ago [-]
Same battery life here. :( Love the actual machine but the battery life is a disappointment.

Just got a 16GB 500GB M1 Air and so far everything is at least as snappy as the 16" Pro, and battery looks like a full 8 hour day plus so far... it's a keeper I think. I also personally find the keyboard a bit nicer to type on than on the 16", so accuracy is up.

square_usual 1234 days ago [-]
Moving from a 2015 Air to a 2018 Pro felt like a downgrade since the battery life was much, much worse. I have to keep the power plug around if I want to work on the couch for any amount of time.
dragonwriter 1234 days ago [-]
Isn't the Pro line optimized for power, while the Air line is optimized for mobile utility (incl. battery life)?
Rillen 1233 days ago [-]
That's crazy. Are you using it as your daily primary workhorse?

I can't imagine how you got used to the performance you have right now. Either you are not doing much with your device, or you would hugely benefit from upgrading at least every 3 years or so.

z6 1234 days ago [-]
Yeah, maybe. It's still hard for me to envision giving up a 2014 and 2015 MBP. They both work flawlessly and don't compromise on any of the useful features, like having USB A ports, SD card slot, MagSafe, no touchbar, reasonably sized touchpad and so forth.
novok 1234 days ago [-]
My plan is to get a min-spec M1 Air as a personal laptop and then get the real 14"/16" M2 version with touch, more cores and more RAM later on. As one Apple employee told me, you never really want to get a v1 of Apple anything if you can help it.

With the 'touchification' of Big Sur, it seems pretty likely Apple is going to release some sort of convertible MacBook or maybe something iPad-style. Or maybe iPads can start running macOS?

Work laptop will get upgraded to an M1 one way or another anyway.

alphabettsy 1234 days ago [-]
I keep seeing this, but the performance gains and real-world battery life (Docker and all) are significant in the 16 inch MacBook Pros - more so than even on my 2017 15".
smnrchrds 1234 days ago [-]
I don't even want to imagine how expensive the 64 GB version is going to be.
claudeganon 1234 days ago [-]
I upgraded to a high spec MacBook Pro 16 after the M1 model was announced. I need solid virtualization support for Windows and Linux now, plus Big Sur seems kind of a mess from the design and privacy side of things. I'll probably wait 3-5 generations until everything with Apple Silicon is worked out to upgrade again, maybe get an Air as a device for travel if things look stable sooner.
woahAcademia 1234 days ago [-]
That's pretty much how I feel too.

Nothing enticing because my computer is lightning fast and this means having to use the Apple ecosystem.

It's almost like Apple has a culture of forcing people to do things the hard way, just so it feels rewarding when it's done. Gamifying the experience?

heisenbergs 1234 days ago [-]
The reason it has been stuck at 16GB is precisely because Intel mobile chips did not support any more than that up until a year ago. So rather than blaming Apple, the blame should be placed on Intel.

I don't think more than 16GB of RAM should be expected for entry level Apple laptops.

The low amount of ram, energy inefficiency and plateauing performance are the exact reason why Apple is moving away from Intel and onto a platform where they can control their own destiny.

Tuna-Fish 1234 days ago [-]
The actual reason the M1 Macs cap out at 16GB is that they use LPDDR4, and that on a 128-bit bus caps out at 16GB. Assuming that they move to LPDDR5 as it becomes available (hasn't Apple generally been pretty quick on the uptake of new RAM standards?), that limit goes up to 64GB.
DCKing 1234 days ago [-]
It's a conservative spec for sure given the rest of the M1. Samsung has shipped their own LPDDR5 in the Galaxy S20 since February. Maybe production capacity is still too limited.
Tuna-Fish 1234 days ago [-]
I believe that Samsung co-designed the memory with their interface block, and then it was effectively codified as the standard. Most other companies had to wait for the standard to be formed before they could start work on their IP. This gave Samsung a huge head start.
ustolemyname 1234 days ago [-]
Intel's fault? Uhm, they've had some misses recently, but your statement about RAM support is incorrect.

The 7200U supports 32GB of RAM. That chip is over 4 years old and nothing special. Not sure how long that's been supported; I didn't feel like looking further back.

https://ark.intel.com/content/www/us/en/ark/products/95443/i...

X-Istence 1234 days ago [-]
It does not support LPDDR4... which is something Apple has been targeting because it uses less power.
rgbrenner 1234 days ago [-]
The just released 16" macbook pro has DDR4. They clearly aren't married to LPDDR.
X-Istence 1234 days ago [-]
Probably because they couldn't keep waiting. It also means the 16" MacBook Pro goes through battery life at high speed from experience.
AzN1337c0d3r 1234 days ago [-]
Apple switched to DDR4 when the 2018 Macbook Pros were released which allowed them to support 32 GB of RAM.

The rated battery life of the Macbook Pro didn't change between 2017 and 2018.

X-Istence 1234 days ago [-]
The rated battery life may not have, but the actual, real-world battery life certainly has. Speaking as the user of a 2019 MacBook Pro with DDR4 that I use for work.

I find myself scrambling for a power supply way more often than I ever did with my 2017 15" MacBook Pro.

Even with the older battery in the 2017, with more cycles, I get a longer battery life out of it than the 2019.

AzN1337c0d3r 1232 days ago [-]
And I have the complete opposite experience. Pinning this on the switch from LPDDR3 to DDR4 seems like a very weak conclusion. Especially when it is known that processors and displays are the largest consumers of energy and the 2019 MacBook Pro boosts way higher with more cores than the 2017.
ustolemyname 1234 days ago [-]
Huh, so they did. TIL.
rgbrenner 1234 days ago [-]
Apple sells the 16" macbook with 64gb of DDR4. It's their decision whether to use LPDDR or DDR. They choose LPDDR for some and DDR for others.

Pretty incredible to blame another company for a decision that was made inside Apple. Especially for a chip that was designed from scratch in apple, and therefore has no Intel constraints.

ffggvv 1234 days ago [-]
What do you mean, entry level laptops? They don't offer more than 16 and their base RAM is only 8...

And how are the MacBook Pro or Mac Mini entry level laptops?

djrogers 1234 days ago [-]
The mini is their cheapest computer. The Air has been their entry level laptop for the last several generations. The 13” Pro has had 2 versions since 2016 - a low-end 2 port model, and a higher end 4 port model.

Every device with an M1 available today is Apple’s lowest-end (and least expensive) offering.

krzyk 1234 days ago [-]
> The reason it has been stuck to 16GB is precisely because Intel mobile chips did not support any more than that up until a year ago. So rather than blaming Apple, it should rather be placed on intel.

That is not true. For example, the CPU I had in my old laptop: https://ark.intel.com/content/www/pl/pl/ark/products/78939/i...

Supports 32GB RAM, released in 2014.

The reason some older laptops didn't have more RAM was twofold: 1. either the BIOS wasn't written with such support, or 2. there were no RAM sticks big enough (no 16GB in a single SO-DIMM), so you couldn't have 2x16 - but for a year or two now there have been SO-DIMMs with 32GB RAM, so you could easily reach 64GB in a laptop.

npunt 1234 days ago [-]
GP means LPDDR3 support, which capped out at 16GB on Intel, which was because their 10nm lines with LPDDR4 support were delayed for years. It was always possible to use regular DDR, but for laptops the low-power variant is important.
FireBeyond 1234 days ago [-]
... and yet they use DDR4 (not LP) on at least one of their laptop lines.
npunt 1234 days ago [-]
Yes they do on the 16", after years of complaints from users who didn't understand the constraint, and to which they added the maximum allowable battery size (~100Wh), and it still only lasts 3-4h in moderate use.
AzN1337c0d3r 1234 days ago [-]
They switched it on the 2018 15-inch and it didn't change the battery life at all.
saagarjha 1234 days ago [-]
Seemingly out of necessity?
krzyk 1233 days ago [-]
GP didn't write about LPDDR3, just a vague statement that Intel CPUs didn't support >16GB RAM, which I proved was wrong.
Toutouxc 1234 days ago [-]
I don't know. I did what I'm currently doing (Ruby on Rails dev) on a 4 GB MacBook Pro until last year and it was fine.

At the moment I have around 2 GB free on my 8 GB Linux machine with a webserver, MariaDB, a JetBrains IDE, Slack and Firefox running. It would be easy and dirt cheap to increase the RAM to 16 or 32 GB but I'm too lazy to even order it and open the machine up since I have never felt the need to have more.

I'm just saying this so that people like me remember that they're still professionals and their work is valuable even if they don't need 64 GB of RAM.

lumost 1234 days ago [-]
IDEs such as IntelliJ can easily consume 8GB of memory on a moderately sized project of 100k loc. If one has to open multiple such projects or work on a very large project in the million loc scope then you’ll run out of 16 GB really fast.

Folks working with data intensive applications often need to trade off writing code to page data in and out of memory during development with larger dev boxes.

Jtsummers 1234 days ago [-]
100k LOC would be, let's be generous, about 8 million characters (80/line average, which is high). 8GB of RAM for 8MB of input seems excessive to me. What's going on with that IDE that it needs so much for something so small? Even if that were 8 million distinct tokens/symbols in the input, that's still 1KB/token of memory consumed.
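
Spelled out, with the deliberately generous assumptions above:

    # 100k LOC at a (high) 80 chars/line average.
    source_bytes = 100_000 * 80
    print(source_bytes)               # 8,000,000 -> ~8 MB of input

    ide_bytes = 8 * 1024**3           # 8 GB of IDE memory
    print(ide_bytes // source_bytes)  # ~1073 -> ~1 KB of RAM per input byte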
CapsAdmin 1234 days ago [-]
I would assume code analysis.

8GB still sounds a bit excessive, but given how long Java stack traces can be, maybe it's not so strange.

aledalgrande 1234 days ago [-]
Except that is only what an IDE does at the very minimum. An IDE loads plugins, provides intellisense, documentation, potentially loads documentation of all the libraries, watches files, displays Git status, parses your code for syntax highlighting etc. etc.

You're not opening a TXT file with 100K lines in it.

z3t4 1234 days ago [-]
For every keystroke a new immutable string is created with the content of the file/state. Plus data structures for the AST/static analysis of every file in the project, including dependencies. Plus another copy of the file for rendering with syntax highlighting. One would think all this copying of data is very inefficient, but computers are good at moving data around. The most costly part is rendering the glyphs.
wulfklaue 1233 days ago [-]
> IDEs such as IntelliJ can easily consume 8GB of memory

Sounds more like a JetBrains (IntelliJ's parent company) problem. I used to develop on PhpStorm (another JetBrains product, same software underneath) and the slow startups, 2GB+ memory usage for relatively small projects, frequent crashes, and the price (especially with the license change years ago) wore me down...

Eventually I switched to Visual Studio Code, and while people whine about how memory-inefficient Electron is, it's like 1/6 the memory usage of PhpStorm, with only a few features missing.

Even on a 32GB PC system, I barely use 16GB (and WSL2 is eating up a lot of that with a lot of Docker images. And it's not really Docker, just the Linux cache eating up memory). Not an issue on a Mac, which does not need a VM-like layer.

It's about priorities sometimes. If people keep upgrading their memory, developers/companies simply push the responsibility to the clients and do not bother spending time on optimizing.

If people start leaving software products by the wayside for being horribly inefficient messes, then maybe a bit of focus will come back to actually optimizing products! You will see JetBrains change its tune when VS Code etc. keeps eating its market share.

fayten 1234 days ago [-]
I have worked on a large 10 year old Spring application with well over 300k LOC for 3 years. I don't recall IntelliJ ever exceeding 3GB of RAM. IntelliJ remained snappy on my 2015 MacBook Pro with 16GB of RAM and an i7.
CapriciousCptl 1234 days ago [-]
Same experience here. My Ubuntu laptop accidentally doubled down on 16GB RAM because I forgot to give it a swap partition (whoops). It hangs maybe once a month under a VS Code, PostgreSQL, FastAPI/Python, React workload. Basically, not frequently enough to annoy a lazy person like me into changing the partition table. The same workload on my 7 year old Ubuntu desktop with a swap partition on a fast SSD very rarely ever hangs.

Certainly some devs need more than 16GB, but they know who they are. And you know who they are because, well, they love telling you :).

CraigJPerry 1234 days ago [-]
You could create a swap file rather than a dedicated partition:

    $ dd if=/dev/zero of=/swap bs=1M count=8192   # create an 8 GiB swap file (run as root)
    $ chmod 600 /swap                             # swapon warns if the file is world-readable
    $ mkswap /swap
    $ swapon /swap
    $ echo "/swap swap swap defaults 0 0" >> /etc/fstab   # persist across reboots
laurencerowe 1234 days ago [-]
You don't need to repartition, just configure it to use a swap file. If you set swappiness to zero it will only be used when out of RAM.
barkingcat 1234 days ago [-]
you can always add a swap file ... that makes your life easier so you don't have hangs, and you don't need to repartition.

I use swap files all the time with things like raspberry pi's and such because I don't want to mess with the sd card partitions, but still from time to time it's good to have that swap fallback since the memory onboard is limited.

Swap files are super easy to set up, take up no "brainpower" and disk space is cheap these days.

matwood 1234 days ago [-]
I think it really depends on whether your development workflow requires VMs/Docker or not. It seems crazy to me to have a bunch of containers running using 4-8GB each, but for many that is how they develop.
e12e 1234 days ago [-]
Curious - why IntelliJ for Ruby on Rails? (I'm quite happy with (neo)vim (qt) and a slew of (mostly) tpope's plugins.)
leipert 1234 days ago [-]
Not GP, but RubyMine (the Ruby IDE from JetBrains) has decent support for refactoring, type inference, executing single tests with a debugger, etc.
notreallytrue 1234 days ago [-]
The system is swapping a lot but the SSD is giving you the impression that 8GB are enough
Wowfunhappy 1234 days ago [-]
> but the SSD is giving you the impression that 8GB are enough

If it works, is it impression or reality?

robgibbons 1234 days ago [-]
Swap isn't great for SSD lifespan. Probably not a drastic loss, but nonetheless not helping its longevity.
notreallytrue 1234 days ago [-]
Impression is in the realm of reality, it's simply deceptive

JetBrains fitting in 8GB of memory with Slack and Firefox while leaving 2GB free? It's not happening.

I used to code on 800×600 monitors professionally

If I say today that they are more than enough to work, I expect to be taken to a mental institution

spyridonas 1234 days ago [-]
Until the SSD dies. It's recommended to disable swap if you have an SSD.
rsynnott 1234 days ago [-]
I mean, it depends what you do. I have 16GB on my personal laptop, and essentially never swap, but the 16GB on my work machine is painful for large Scala projects.

As always, for some people 16 GB will be fine, for some it won’t.

wsc981 1234 days ago [-]
8 GB when using the Android emulator and Xamarin has been a pain for me, but my iMac doesn't have an SSD (it's got an Apple Fusion drive). The machine was swapping all the time, e.g. when alt-tabbing between the Android emulator and Xamarin.

After upgrading to 24 GB the machine became very usable.

I wouldn't ever want to go back to 8 GB. Perhaps 16 GB might be usable for iOS and Xamarin development, especially if the Android emulator can run in some kind of HAXM [0] mode, but for ARM processors ... not sure if that's possible right now. I'm sure the SSD also makes a big difference compared to my Fusion drive right now.

Still it feels safer (more future-proof) if the machine has a bit more memory than 16 GB.

---

[0]: https://github.com/intel/haxm/wiki/Installation-Instructions...

jmnicolas 1234 days ago [-]
FWIW on Windows, a simple Flutter project in Android Studio + the emulator + a couple of browsers open and I'm hovering around 14 to 16 GB of RAM.

There's no way I'm buying a 16 GB computer now, 32 GB is a minimum to be future proof.

megablast 1234 days ago [-]
An SSD would have made a bigger difference. And Xamarin?? I'm so glad I don't have to use that anymore.
wsc981 1234 days ago [-]
Honestly Xamarin is pretty nice these days if one's aim is to share a lot of code between iOS and Android projects while still being able to provide a native experience.
melling 1234 days ago [-]
I bought a 16GB MacBook Pro in 2013. I can’t imagine that I wouldn’t want more in 2021. I might have my next laptop until 2028.

As always, more RAM extends the usefulness of a computer.

Abishek_Muthian 1234 days ago [-]
If virtual machines are an essential part of the workflow then 16GB is out of the question IMO; you could get away with Tiny Core Linux/Lubuntu in a VM to some extent, but when getting serious work done on both host and guest, 16GB would be a bottleneck for the VM.
cataphract 1234 days ago [-]
64 GB is already the minimum for building big projects like V8 on 16 cores (e.g. on the 3950X)
aledalgrande 1234 days ago [-]
I'm curious about this, because I read some posts e.g. https://medium.com/computed-comparisons/garbage-collection-v... that say memory management and bandwidth are so fast, you actually need a lot less memory than on Intel.

Edit: not my opinion, just what I found on the internets, not sure why the downvotes if it invites discussion.

fulafel 1234 days ago [-]
That article seems confused and/or misinformed on many points, on many levels.

As for the GP cited use case, there's no plausible way in which the compilers involved in the V8 build process would start using less memory due to their memory management machinery getting magically transformed by the hardware from tracing GC to refcounting GC, even assuming LLVM had been using tracing GC in the first place.

(Also the tracing GC memory overhead claims originating in Apple's marketing copy are way hyperbolic, modern tracing GCs don't require double storage used in "mark and sweep" GCs)

aledalgrande 1234 days ago [-]
Makes sense yeah! I guess anything non native would work like that.
fulafel 1234 days ago [-]
Many memory-hungry apps on Apple platforms - web browsers, compilers, photo/video editing, games, dev tools like VS Code/Electron/Java IDEs/Emacs/databases - are in this category.
steipete 1234 days ago [-]
I’ve read the reports but I’m not seeing this. A normal workflow already puts me over 11 GB and into swap.
chongli 1234 days ago [-]
I think the point is that memory bandwidth and SSD bandwidth and latency have improved so much that M1-based Macs are really fast at swapping. Combine that with compression (which macOS has done for a long time) and I could imagine iPadOS-like performance.

For normal users (such as people not compiling V8), swapping may be virtually unnoticeable.
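
(Both mechanisms are visible with stock macOS tools; a hedged sketch, since exact output wording varies by macOS version:)

  sysctl vm.swapusage          # total / used / free swap
  vm_stat | grep -i compress   # pages handled by the memory compressor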

kllrnohj 1234 days ago [-]
Macs have had fast SSDs for a while, and as you say compression has also been done for a long time. The M1 made no difference whatsoever to memory usage. Posts like the above seem to be an attempt to fanfic-justify what is an actual regression for some.

For many people 16GB is absolutely enough, and Apple had to start somewhere. But if 16GB wasn't enough for you yesterday, it isn't enough for you now.

chongli 1234 days ago [-]
The reviews of the M1 MacBook Air and Pro show a dramatic increase in SSD read/write speeds over the 2020 Intel models.
kllrnohj 1234 days ago [-]
Do you have a link? MKBHD just said it was "slightly faster" than his comparison system, and the numbers for the M1 appear to be around 3GB/s reads and 2.7GB/s writes in Disk Speed Test, which is pretty typical of a last-generation NVMe drive (current PCIe gen4 drives are hitting 7GB/s reads & 5GB/s writes)
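
For a crude sequential sanity check on your own machine (illustrative only; caches and SLC write buffers skew the numbers):

  # write ~1 GB of zeros and note the bytes/sec dd reports
  dd if=/dev/zero of=ddtest bs=1m count=1024
  rm ddtest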
chongli 1233 days ago [-]
> which is pretty typical of a last-generation NVMe drive

Those numbers are about 2-3x as fast as what my 2020 Intel-based MacBook Air gets.

kllrnohj 1233 days ago [-]
Then it sounds like Apple put some really garbage SSDs in the 2020 Intel-based Air; not sure what you're looking for here?

But 2-3x won't change swap from grinding to a halt to perfectly smooth, either. It's still over 20x slower than RAM.

More significant for swap usage though is random reads and read latency, neither of which are going to be particularly impressive on the M1's SSDs. You need something like Optane to make that a compelling case.

foldr 1234 days ago [-]
It's possible that M1 has enabled the use of better memory compression algorithms.
rsynnott 1234 days ago [-]
I do wonder if it’s taking advantage of the cheap CPU power and memory bandwidth to be much more aggressive about memory compression. That could reduce swapping for some workloads, though probably not a v8 build.
shawnz 1234 days ago [-]
What do you mean exactly? It is normal for swapping to happen even when you are using less than the physical memory, so as to trade rarely-used RAM for frequently-used disk cache, which improves overall throughput.
aledalgrande 1234 days ago [-]
Thanks for reporting! Bummer, for me I think I will have to wait for an M2.
jcelerier 1234 days ago [-]
> There has been a lot of conspiracy on 16GB being plenty enough

I bought a computer with 16GB in 2011, and around 2015 it was already lagging quite a bit due to swap - switching to 64GB in 2016 was night and day. I don't have to fear launching arbitrarily many jobs anymore.

beowulfey 1234 days ago [-]
RAM speeds have also increased quite a lot since 2011. Do you find yourself hitting 64GB of memory use often?
jcelerier 1234 days ago [-]
not 64, but I'm regularly hovering in the 40-45 GB range
square_usual 1234 days ago [-]
That seems like a very specific use case. I've had 8GB until 2016 and 16GB since and I've never had to deal with that much memory use.
AzN1337c0d3r 1234 days ago [-]
Anyone who runs a couple of VMs or a ton of docker containers is going to run into memory problems.

And that's not an uncommon use-case amongst the developer crowd which has a lot of overlap with HN.

izacus 1234 days ago [-]
> Xcode runs FAST on the M1. Compiling the PSPDFKit PDF SDK (debug, arm64) can almost compete with the fastest Intel-based MacBook Pro Apple offers to date, with 8:49 min vs 7:31 min. For comparison, my Hackintosh builds the same in less than 5 minutes.

This says a lot about how unrealistic the Geekbench benchmarks used by most reviewers really are - they were all forecasting the M1 completely humiliating everything from Intel, and yet in practice it's not quite right. It's still a dang impressive chip, though.

dschu 1234 days ago [-]
Yeah, but the MBA is doing all that with just 10W (15W burst), while your Hackintosh most likely consumes 400W.

Not really comparable though.

ianhowson 1234 days ago [-]
I'm happy to pay for the electricity if I get faster builds.

My daily driver is a Hackintosh, and with the CPUs pegged it pulls about 70W from the wall. The two displays add another 90W.

M1 is impressively efficient but there's still a gap for fast, no-compromise workstations.

kevindong 1234 days ago [-]
On a desktop, power consumption doesn't really matter.

But on a laptop, it matters a ton. Personally I prefer doing work from my couch rather than my desk; ergonomics and external monitors be damned.

jmnicolas 1234 days ago [-]
I don't know how old you are, but you might regret that later.

Around 40 years old is usually when you start to pay for carelessness with your body. Ask me how I know ...

bori5 1234 days ago [-]
100%, everything’s easy in your 20s! ;)
jdshaffer 1234 days ago [-]
Except finances. :-)
1_player 1234 days ago [-]
Sounds like you need a better desk/chair.
threeseed 1234 days ago [-]
> M1 is impressively efficient but there's still a gap for fast, no-compromise workstations.

Apple has already said that they are releasing a workstation-class M1X/M2 next year.

The point is that the M1 is already comparable to many workstations.

izacus 1234 days ago [-]
And yet everyone keeps comparing those and claiming how much faster the Air is supposed to be, based on a very funny benchmark.

Not sure how power use actually matters when I'm sitting and waiting for things to compile several times a day.

sudosysgen 1234 days ago [-]
Sure, but his Hackintosh isn't running a laptop chip. 15W X86 CPUs would probably compile that faster than the M1 if the difference from Intel is that small.
postalrat 1234 days ago [-]
Maybe 400W burst with a fast graphics card. Do we have actual measurements from the MBA yet?
djrogers 1234 days ago [-]
This is one data point among many. Tons of devs have been tweeting their compile times, and in many/most cases the M1 spanks the best Intel Mac out there.

If anything, the numbers in TFA are an outlier rather than a contradiction.

selsta 1234 days ago [-]
Single core vs multi core.

The top Intel MacBook has 8 cores and only beats the M1 (4 performance cores) by 1 minute.

The desktop Hackintosh most likely has way more cores; expecting the low-power M1 to beat it is unrealistic.

ianhowson 1234 days ago [-]
I hate the "M1 vs Intel MacBook" comparison. Every Intel MacBook back to 2016 has broken thermals. They're all running at maybe half their rated clock speed. 13" MBP is a 4GHz part running at 1.4GHz. 16" MBP is a 4.8GHz part throttled to 2.3GHz. You're comparing M1 vs. a broken design which Apple broke.

Don't congratulate Apple for failing to ship trash.

There's an argument for efficiency on a laptop, no doubt, but that's not what the parent commenter is talking about.

M1 is the highest perf-per-watt CPU today, no question. Ignoring efficiency, there are plenty of faster CPUs both for single-core and multi-core tasks. That's what "my Hackintosh did the build in 5 minutes" is showing.

faxfax 1234 days ago [-]
You're misunderstanding Intel's specs. If you want the chip to run within TDP you can only expect the base frequency across all cores, not the ridiculous turbo frequency. The best laptop chip Intel has right now is the i9-10980HK with 8 cores at a 2.4GHz base frequency and a 45W TDP. Apple's laptops are more than capable of dissipating the rated TDP and hitting the base frequencies (and often quite a bit higher), although the fans can be a bit loud. So Apple's designs are not broken, at least not by Intel's definition.

You can relax the power limits and try to clock it closer to the 5.3GHz turbo frequency. But how much power do you need? I can't find numbers specifically for the i9-10980HK, but it seems like the desktop i9-9900K needs over 160 watts [1] to hit a mere 4.7GHz across all cores, measured at the CPU package (ie. not including VRM losses). Overall system power would be in excess of 200 watts, perhaps 300 watts with a GPU. Good luck cooling that in a laptop unless it's 2 inches thick or has fans that sound like a jet engine.

[1] https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9...

ianhowson 1233 days ago [-]
You've got it backwards. Apple chooses the TDP. Intel provides the CPU to suit. Apple is choosing TDPs which are too small and then providing thermal solutions which only just meet that spec. They could provide better thermals without hurting anything else in the machine and get a higher base clock.

I assume they do this for market segmentation; see 2016 Touch Bar vs. non-Touch-Bar Pro. One fan vs. two.

The TDPs look appropriate for M1 parts. They're too small for Intel. I'm guessing that (a) Apple predicted the M1 transition sooner and (b) Apple designed ahead for Intel's roadmap (perf at reduced TDP) which never eventuated.

So, unfortunately, Apple have shipped a generation of laptops with inadequate cooling.

read_if_gay_ 1234 days ago [-]
> Every Intel MacBook back to 2016 has broken thermals. They're all running at maybe half their rated clock speed.

Are you saying it's an unfair comparison? The Intel Macs are operating in the same environment as the M1 Macs. It doesn't matter if the Intel parts could be faster in theory, because you're still dealing with battery and size constraints. If you want unthrottled Intel CPU in a laptop, your only options are 6 pound, 2 inch thick gaming laptops with 30 minutes of battery life. Now comparing that (or worse, a desktop) to M1 is unfair.

mbell 1234 days ago [-]
> 13" MBP is a 4GHz part running at 1.4GHz. 16" MBP is a 4.8GHz part throttled to 2.3GHz.

Apple's thermal solutions could be better, but they are designed within Intel's power envelope specs. e.g. The i9-9880H in the current 16" MBP is only rated for 2.3Ghz with all cores active at its 45W TDP. The i9-9880H is a 2.3Ghz @ 45W part that can burst up to 4.8Ghz for short periods, not the other way around.

kalleboo 1234 days ago [-]
Is there any reason the believe the M1 Macs don't also have broken thermals? I mean the Air doesn't even have a fan! Can't get more "broken" than that
andrekandre 1234 days ago [-]
what could they have done to improve the thermals?

is it just a matter of making a thicker laptop?

trynewideas 1234 days ago [-]
That's one of my biggest sources of skepticism about the M1 in the long term — in lieu of improving thermal management, they instead reinvented _everything_ to generate less heat. Which is great! The current state of thermal management at Apple will work great at low TDPs, but they've procrastinated instead of improving. If they never learn how to handle heat, this arch will still have a hard ceiling.

There's nothing in M1 that indicates that Apple learned how to improve thermal management, but lots to indicate that they'd still rather make thinner/lighter devices that compromise on repair, expansion, or sustained high-end performance — the even deeper RAM integration, offering binned parts as the lower-end "budget" option instead of a designed solution, or offering Thunderbolt 3, fewer PCIe lanes, and a lower RAM cap as being enough for a MBP or Mini.

adamnemecek 1234 days ago [-]
They were comparing m1 vs the intels that are in MacBook Pros, not any intel.
mcintyre1994 1234 days ago [-]
FWIW I’ve seen iOS developers compare the M1 laptops favourably to $5000+ iMac Pro’s as well.
izacus 1234 days ago [-]
Which is why it's such a horribly meaningless comparison - the marketing (and the reviewers assisting it) blasted out comparisons with Intel's obsolete technology, hamstrung by awful thermals, and then proclaimed it generally faster.
czzr 1234 days ago [-]
Which laptop should it be compared to?
macintux 1234 days ago [-]
It would be interesting, though, to see what those numbers would look like with a fan. Anything that runs that long on an Air is going to start throttling.
wmf 1234 days ago [-]
The new Mac mini is faster but not by much. The M1 just can't use much power.
izacus 1234 days ago [-]
Yeah, seeing numbers from either the Mini or the Pro might be more interesting.
Aperocky 1234 days ago [-]
You can't compare M1 to a 95W TDP chip, that's not really fair.
izacus 1234 days ago [-]
And yet Apple on their event and every single reviewer out there still did it.
Aperocky 1234 days ago [-]
Well, I guess I phrased it badly. I mean, there are no fair comparisons, but being in the comparison at all already shows how crazy powerful the M1 is.
gst 1234 days ago [-]
> IntelliJ is working on porting the JetBrains Runtime to Apple Silicon. The apps currently work through Rosetta 2, however building via Gradle is extremely slow. Gradle creates code at runtime, which seems a particular bad combination with the Rosetta 2 ahead-of-time translation logic.

Java does not need to be run through emulation. Azul has already published ARM64 Zulu builds of OpenJDK (here: https://www.azul.com/downloads/zulu-community/?package=jdk) and they work great on the M1. I'm currently using IntelliJ running in emulation mode (an official release for the M1 was originally expected by the end of November), but I build/run my projects on a local Zulu JVM with ARM64 support.
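
For anyone trying this, a minimal sketch of the setup, assuming the Zulu build was installed via its .dmg (the version passed to java_home is whichever one you installed):

  # list installed JDKs; the Zulu ARM64 build should appear here
  /usr/libexec/java_home -V
  # point the shell (and anything it launches, e.g. Gradle) at it
  export JAVA_HOME=$(/usr/libexec/java_home -v 11)
  java -version     # should report an arm64/aarch64 Zulu build
  ./gradlew build   # now runs on the native JVM instead of Rosetta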

the_only_law 1234 days ago [-]
Not sure why people are so worried about developers getting trapped by locked-down systems. I have plenty of other PCs running Linux, *BSD, and Windows that are much less locked down. I’m very aware Apple is controlling of its platforms. No, I’m not thrilled about certain aspects of the M1 platform, but most platforms I’ve used feel locked down in some shape or form. If Apple does try some of the outlandish theories out there, I’ll simply cease to use it and probably trash Apple then, as it will be useless to me, but I’m not locked into anything.
floatingatoll 1234 days ago [-]
All x86_64 Windows processor manufacturers are supporting the Microsoft firmware that offers secure attestation about whether your system was booted securely or not, so if by lockdown you mean “I can make undetectable changes to my system”, that’s coming to an end as rapidly as they can bring it to market (before Apple locks them out of it). You’ll still have the option to alter whatever you please, but software and websites will have the option to refuse service to your modified system.
userbinator 1234 days ago [-]
> but software and websites will have the option to refuse service to your modified system.

Oh hell no! This is the scenario that Stallman warned us about, over two decades ago:

https://www.gnu.org/philosophy/right-to-read.en.html

floatingatoll 1233 days ago [-]
Citing a twenty-plus-year-old manifesto offers no guidance on how to reconcile the idealistic view of “It is my right to modify” with the modern-day problem of “I must protect myself and others from malicious modifications”.

Stallman hasn’t been of much use for this, and his followers fervently cite his decades-old works each time I raise this point — but meanwhile, this is available today to any website accessed by any Apple Silicon MacBook, and it’s already deployed and in use in the (Linux) servers powering Azure Cloud, and it’ll probably reach the consumer Windows market in the next year.

Now what? After we get past “RMS told us this would happen”, after we blow off steam about “My ideals are being violated”, is there anything left to discuss and consider here?

I thought there was — for example: “Is it possible to reconcile the conflicting needs of safety and modifiability?” — but the prevalence of replies like the above over the past few weeks makes me think that I’m mistaken, and should simply let this go unnoticed until it’s too late and irrelevant what anyone believes.

rstuart4133 1233 days ago [-]
> the modern-day problem of “I must protect myself and others from malicious modifications”. Stallman hasn’t been of much use for this

Clearly we differ, because it looks to me like Stallman put forward the only plausible solution to someone taking control of my system without my knowledge or permission: open source. Granted, that wasn't what he was trying to do at the time; he was, as you say, pursuing the "It is my right to modify" line. But that's how it's turned out.

You are apparently perfectly happy to have Apple / Google / Microsoft or whoever install whatever backdoors and spyware they please on your system. It's not like you have a choice, or will even know, so it's probably best you've made your peace with it; just as people have made their peace with Facebook, Google Chrome not deleting their own cookies, or Microsoft refusing to copy files because some virus checker had a copyright violation signature fire (that's actually happened to me).

Maybe I'd even be OK with trusting those companies, but I definitely draw the line at governments granting themselves the legal right to rummage through that same cookie jar, which is exactly what the Australian government did with its Assistance and Access bill. [0] I'm sure all governments do the same thing of course, including the Chinese government. A good indicator of how seriously governments themselves take this threat is how Huawei is being treated by Western governments. I have no doubt the Russian and Chinese governments view Western gear with the same level of suspicion. To me that is the only sane position to take.

Just to be clear, I'm not saying TPMs and the DRM they enable aren't useful; the problem is the lack of visibility into what these black boxes you are carrying around with you and putting in your living room are doing. If you know what those boxes are doing, locking them down so hard they can’t be compromised by someone who has physical access is a nice addition, although that threat scenario (someone who has physical access) is very limited, so perhaps not a major one. But what you seem to be applauding is locking them down so hard that even you, who have physical access, can’t see what they are doing; and then you go on to pillory a person who proselytized making all software transparent, so everyone could see whether their systems are running software they approve of, and not malware or worse.

[0] Quick summary: the Assistance and Access bill gives the Aussie government the right to force any company to write spyware that won't be detected by their OSes (that's the "Assistance" bit), and then install it via their auto-patch systems onto the device they nominate (that's the "Access" bit).

floatingatoll 1233 days ago [-]
My personal opinion on this technology isn’t included here, and I’ve made a point of withholding it each time, specifically to deny the opportunity to invoke the messenger’s feelings as relevant to the issue at hand. What I feel about this doesn’t matter, because this is already live in two marketplaces and headed rapidly to a third. I could be pro, I could be con, I could be both/mixed or uncertain/apathetic (hint: it’s not a purist view on either side of the fence). Discussing my viewpoint isn’t even possible, yet half of the words in your reply are dedicated to your speculation about it.

Focus your energy on the real issues at hand:

How are we going to adapt to the reality of secure attestation? How are we going to confront it with technology? How should we legislate to protect against abuse of it? How can we make use of it appropriately?

My goal is to raise awareness, and based on the other half of your reply, I’ve succeeded with one person. That’s progress, I suppose.

pfranz 1234 days ago [-]
At least in the US, my understanding is a 1975 law, the Magnuson–Moss Warranty Act, protects warranties on modified products as long as the modification didn't cause the issue. It comes up all the time with cars since there's an active mod community.

That being said, I usually back up and "factory restore" my computers before servicing. Mostly because I don't want to hand out my password or hand over my personal data, but I expect them to test that it's fixed, and that's easier in a generic OS. I also think it'd be odd to hand over a MacBook Pro (or even a Microsoft Surface) with Linux installed and expect their random, low-level tech to assess things.

the_only_law 1234 days ago [-]
I’m aware that Windows is locked down, and even with Linux or *BSD, parts of the platform you’re running on are at least mildly concerning. However, as you said, I can still alter what I wish, whereas a lot of theories I’ve heard about Apple here recently involve turning the Mac platform into some sort of crippled sandbox in the spirit of mobile devices, or worse, like some consoles that are so locked down hackers haven’t beaten them yet.

I do believe Microsoft would love to head down a path like that, but it is too burdened with the massive Frankenstein of legacy tech that is Windows. I also tend to share the bleak view of the future of computing held by many of the people voicing concerns; I just don’t believe that a niche set of consumers can overpower the majority, who couldn’t care less if you ranted to them about control or privacy or whatever.

amelius 1234 days ago [-]
> If Apple does try some of the outlandish theories out there, I’ll simply cease to use it and probably trash Apple then, as it will be useless to me, but I’m not locked into anything.

And what if it is too late? E.g. Apple owning the entire space you want to develop apps for? Or, if this seems an impossible theory to you, how about competitors copying Apple's model?

Please understand that as a consumer you have power, but if you don't use it you can end up like the proverbial boiled frog.

RONROC 1234 days ago [-]
I can understand this level of pearl-clutching when we’re talking about social media, but I just don’t see Apple as the five-alarm fire it’s made out to be.

Practically speaking, being “locked” into a system would only be a possibility in a world where highly distributed, massively supported OSes (Windows, Linux) didn’t exist.

Anti-Apple people have a hard time fathoming the degree to which many other “power users” are okay with privacy/security trade-offs if it results in a more robust/stable environment that just works.

Problem is, it’s being framed as some sort of Faustian bargain when it’s really not that big of a deal.

skulk 1234 days ago [-]
> that just works

Until it doesn't. I can open my ThinkPad (W530) and replace parts in it that die. Can you say the same for your MacBook? Are you okay with a future where everyone follows Apple's example because it generates more profits, where right-to-repair is a thing of the past? I know that sounds alarmist, but it's definitely the trajectory I see.

wulfklaue 1233 days ago [-]
The EU is introducing laws that will combat planned obsolescence and protect the right to repair. The EU is also fed up with manufacturers thinking they can get away with everything. Those laws will have an effect across the world too, because it's easier to design one product that complies with the strictest laws than two products.

So yeah, Apple's gravy train is going to run into a massive roadblock. In the US, states are also starting to ratify laws that push the right to repair.

It's one of the reasons why Apple "suddenly" allows repair shops to buy parts: it's a way to change the narrative. Of course, that parts-buying comes with "issues", Apple style. A grifter is going to grift.

pixel_fcker 1234 days ago [-]
People that buy MacBooks don’t care, or at least they don’t care enough to buy something else.

As to your second point, MacBooks have been available for how long now? And yet ThinkPads still exist.

skulk 1234 days ago [-]
> And yet ThinkPads still exist

The X1 Carbon doesn't have upgradeable RAM. On some level, I'm convinced that Lenovo wants to follow in Apple's footsteps and is testing the waters with the X1 Carbon series.

noisy_boy 1234 days ago [-]
X1 extreme does; both RAM and SSD can be upgraded
ogre_codes 1234 days ago [-]
When parts in my MacBook Air die, I guess I'll find out. It's 10 years old now and doesn't show any sign of breakage. I'm pretty sure it's paid for itself several times over.

Since the new Air lacks even a fan, it seems even less likely to need service.

skulk 1234 days ago [-]
This reads like: "I guess I'll find out whether my privacy is being respected when I actually have something to hide."
jmnicolas 1234 days ago [-]
You have the same power as someone who votes: one vote, which is not much.
CalChris 1234 days ago [-]
Homebrew is still a work in progress for Big Sur, let alone M1, and brew upgrade on an x86 Mac will give you this message:

  Warning: You are using macOS 11.0.
  We do not provide support for this released but not yet supported version.
  You will encounter build failures with some formulae.
  Please create pull requests instead of asking for help on Homebrew's
  GitHub, Twitter or any other official channels. You are responsible
  for resolving any issues you experience while you are running this
  released but not yet supported version.
steipete 1234 days ago [-]
It works surprisingly well though, been using it since the early betas.
CalChris 1234 days ago [-]
Well, I did have one problem when I upgraded Homebrew on Big Sur: for some reason `brew link unbound` returned a `/usr/local/sbin is not writable` error. I looked on Ask Different and found this [1] (for an earlier version of OS X). Basically, mkdir the directory.

[1] https://apple.stackexchange.com/questions/312330/brew-link-u...
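
The fix from that answer, roughly; a sketch assuming the standard Intel Homebrew prefix:

  sudo mkdir -p /usr/local/sbin
  sudo chown -R $(whoami) /usr/local/sbin
  brew link unbound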

gbil 1234 days ago [-]
that was the same with catalina when it first came out, same warning so I don't think this will be an issue indeed, also in catalina it worked fine regardless of the warning
mlindner 1234 days ago [-]
So just bypass Homebrew and install your own stuff. It's not that hard. Homebrew isn't at all required, it's just a convenient tool that slightly simplifies some things.
pilif 1234 days ago [-]
Those things are very much worth simplifying: as more and more packages start to rely on pkg-config, getting the headers that ship with whatever SDK you are currently using (Xcode, command line tools) recognized is very tricky and sometimes requires patching the build tools.

Homebrew is the perfect glue between the traditional Unix world and the somewhat unexpected Unix environment that macOS provides these days (mostly a consequence of Xcode wanting to be a self-contained .app and /usr being read-only, while still honoring the tradition of leaving /usr/local alone for you as a user).

mlindner 1234 days ago [-]
This actually simply isn't true. Firstly, pkgconfig works perfectly well on MacOS and always has. It sounds like you more have issues with Xcode than you do with anything else. I've not heard of a commonly used package requiring any sort of patches to get it to compile on MacOS. MacOS is a first class citizen of the open source world and things generally work perfectly for it.

Yes you need to spend a bit of work to find out what dependencies you need, but the configure scripts will tell you that when they fail.

> Homebrew is the perfect glue between the traditional Unix world and the somewhat unexpected Unix environment that macOS is providing these days (mostly a consequence of Xcode wanting to be a self-contained .app and /usr being read-only while still honoring the tradition of leaving /usr/local alone for you as a user)

On Mac packages have always naturally installed into /usr/local and not /usr.

pilif 1233 days ago [-]
The problem is that the macOS SDK doesn’t ship pkgconfig files for the libraries included with macOS.

Unless you go the extra length, the “easy fix” is to compile your own version of those libraries, which is fine until you need to link against a binary shipped with the OS (say you are trying to compile an Apache module).

At that point you will have conflicting symbol names between the binary that shipped with the OS and depends on its libraries and your binary that depends on the self-compiled libraries which might or might not have matching versions and/or custom patches.

libedit, libxml2 and many others are examples of this. Their binaries are in fact installed in the read-only /usr as shipped by Apple, but the matching include files live in the Developer folder of either the command line developer tools or Xcode.

No open source package will be able to link to the system libraries by default with this split, and when all they support is pkg-config, you’re SOL without additional manual work.
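
A hedged illustration of the manual work involved (parse.c is a made-up file, but xcrun and the SDK layout are standard):

  # where the active SDK's headers live
  xcrun --show-sdk-path
  # compiling against the system libxml2 without a .pc file
  cc -I"$(xcrun --show-sdk-path)/usr/include/libxml2" -lxml2 parse.c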

safetyscissors 1234 days ago [-]
I am an iOS developer and I have an M1. There are some issues that need to be ironed out, with some packages I use not compiling for some reason on the arm iOS simulator, but that has just diverted me to compiling on-device, which isn't much of a hassle.

I am not bothered as much by the 16 GB of RAM; it is still rather usable. What I really bought the machine for was the battery life. I've had it for a day or two and it is amazing how long I can stay away from the power socket. Also, this thing runs really cool: I have not heard the fans spin up even once, even during the recent Sydney heatwave (45°C).

__jem 1234 days ago [-]
I've been disappointed in Apple's hardware offerings for developers over the past few years and almost swore them off, but have a renewed interest in this machine. However, it's not because of anything Apple's done in particular, but because I'm in the process of fully transitioning my workflow to the cloud, and using a laptop only as a thin client.

Of course, there will always be a need for running dev toolchains locally, but I wonder how many other people are like me who would rather use Linux over an internet connection, and really only need a terminal emulator and an IDE.

nsm 1234 days ago [-]
This seems like a great use case for a Chromebook. Have you looked into that? It seems like the Apple premium may not be worth it. You get long battery life and a very slimmed-down machine that can run a browser and a terminal, and they are just as slim as an Air + much cheaper.
sally1620 1233 days ago [-]
Chromebooks would be a perfect machine, except that most Chromebooks don't have premium features: dim screens, capped at 1080p, cheap Intel CPUs, plastic chassis, ...

I am still waiting for a premium chromebook with ARM CPU and great battery life to hit the market.

neurostimulant 1234 days ago [-]
There is probably plenty of people like us that do this. VS Code remote development feature is quite popular which indicates many people are going with this setup.
rbrtl 1234 days ago [-]
Why don't you buy an iPad and run a terminal/VNC app?
__jem 1234 days ago [-]
I’ve thought about it! The keyboard is too important for me, and still being able to checkout source code locally and run a local ide tips it in the favor of laptops.
pfranz 1234 days ago [-]
Assuming you prefer Apple's keyboards, I'm pretty sure the Magic Keyboard with Bluetooth works with iPads. The downside is you'd have to manage charging the keyboard and the iPad, but on the bright side you could choose to get the keyboard with the numeric keypad.
rbrtl 1234 days ago [-]
I have heard really good things about the keyboards for the iPads. I’m still using an iPad Air 2 with a Logitech Bluetooth keyboard which works wonders.

There are code editing apps for iOS, but I don’t know what they're capable of these days. Some use cloud servers, but as soon as you know you want the option then the laptop is probably the way to go. Happy hunting!

ultronism 1233 days ago [-]
grab an iPad with a Bluetooth mechanical keyboard
thomasjudge 1234 days ago [-]
Are you using SAAS tools, or remote desktop to a cloud VM?
cyrksoft 1234 days ago [-]
I travel and move around a lot (not so much now, but I still move around my country and to the office), so portability is very important to me. Another advantage of the MacBook line is that I can carry one Anker USB-C & A charger and charge everything I have.

I was looking into buying the new MacBook Air. Does anybody have any particular alternative to recommend? I am looking for a new laptop and haven't found anything similar to the MacBook Air (I hate the Touch Bar) for a similar price (the 16GB RAM model; if expandable, better). Anything similar is either a lot more expensive or the build quality is a lot worse.

Any recommendation is more than welcomed!

6gvONxR4sf7o 1234 days ago [-]
Seems like a cool device. It’s too bad that apparently you need weird workarounds for more than one external monitor on the M1 MacBooks, which is my own personal dealbreaker.
bcit-cst 1234 days ago [-]
I still can't decide. I want to get back into iOS development after 10 years. Which computer should I go for?

What is the opinion on these M1 machines' longevity? Some tech people are comparing them to the first-gen iPad, i.e. they will be sunset after a couple of years, versus the 2nd gen, which was supported much longer. But I am not so sure; most of the issues with these first-gen M1 machines are software, so in theory they should be supported for a long time.

pfranz 1234 days ago [-]
Apple still plans on releasing new Intel macs. I don't even think buying one of those is out of the question for some people.

If you look back at the Intel transition, if I remember correctly they released the MacBook Pro with a Core Duo in January of 2006 and Core 2 Duo in October. The latter being 64-bit. Sure, the latter ones were better, but the former ones worked fine. OS support seems to go with architecture generations. The early 2006 laptops supported Snow Leopard which had its last release in 2011. The next laptops supported Lion which had its last release in 2012 (but with hacks could get updated until 2018).

I think the major question is if you want to wait to see what the "pro" machines look like or if Apple has tweaks in 6 months to the current line up. The Mac Pro has some big questions about what may need to change if they're looking for parity with the current Mac Pro (improved multi-monitor support, more RAM options, upgradable GPUs). Those changes may show up in the high end Macbook Pros (likely not upgradable GPUs).

kalleboo 1234 days ago [-]
The gen 1 apple devices with the worst support lifecycles have been

* iPad 1, which was basically a iPhone 3GS with a bigger screen, and from day 1 didn't have enough RAM to drive the big screen

* First Intel macs, which were 32-bit, before Apple quickly transitioned everything to 64-bit

* First Apple Watch ("series 0") which was just underpowered in general

The M1 Macs really don't appear to have any obvious flaws like these.

emp 1234 days ago [-]
You will be inside of Xcode for most of your time - I just upgraded to a 13" Air (maxed out configuration) from my 2012 11" Macbook Air which was still fine with Xcode. I think any Mac will run Xcode well enough.
lmedinas 1234 days ago [-]
I have the new MBP and I can tell you it's a remarkable machine. Of course you need to check whether the software you need to run works (I assume it will). I guess everything else, native versions, will come with time.
mlazos 1234 days ago [-]
I was confused about why it’s hard to get a license for Windows 10 on ARM. Doesn't Windows 10 already support ARM?
simonh 1234 days ago [-]
Microsoft sold some Surface devices with ARM Windows 10, but they have never sold it as a standalone installable SKU for use on third party devices.
mlazos 1234 days ago [-]
https://docs.microsoft.com/en-us/windows/uwp/porting/apps-on...

This says you can run it on arm with fall creators update or newer.

izacus 1234 days ago [-]
That doesn't mean Microsoft will give you the installer for ARM Windows so you can use it on anything but Surface X.
gpapilion 1234 days ago [-]
I’m surprised at all the reviews saying 16GB is enough; with virtualization basically not existing at this point, I’m scratching my head.

My current computers feel like they have enough memory until I load virtual box, or VMware.

tomduncalf 1234 days ago [-]
The performance of these is very impressive - selfishly, I'm glad to read that the 16" is still faster for Xcode tasks (as I have the same spec as Pete and some of the commentary about the speed of the new models was making me a bit envious!), but I have no doubt the Pro models of these will be remarkably performant.

Personally I will try to wait it out until they release a new design... although may end up with a Mac Mini build machine

stevegeek 1234 days ago [-]
Sorry if this has already come up but apart from the excitement about performance what is the build quality like for the new MacBooks? The main issues I’ve been facing from Apple lately have been physical failures from keyboards to screens and ports. The 16” is better but already I’ve found some keyboard keys get unresponsive occasionally and reported battery health is deteriorating fast after only a handful of cycles...
LexGray 1234 days ago [-]
Most of the complaints I have seen online are "will not power on", DOA-type issues. Not many yet (some review units out of the box, etc.).

Apple deems itself a consumer company and perfection is too expensive. Apple quality is generally going up every year, but slowly.

Apple makes incremental manufacturing improvements nearly weekly, just like software builds. I always say never buy first gen, or at least never on the first day. Never assume your machine is just going to work.

snazz 1234 days ago [-]
Why is WindowServer using over 2 GB of RAM in the Activity Monitor screenshot? That seems absurdly high. Does it have a memory leak?
rayiner 1234 days ago [-]
MacOS stores a full frame buffer for every window (visible or not), for compositing. If I recall correctly, the frame buffers of every active window are attributed to the Window Server. That can get big when there are lots of windows, especially on a high DPI display.
dawnerd 1234 days ago [-]
That’s also why all those single app benchmarks saying 8gb is enough are a bit silly. No one just uses one productivity app.
simonh 1234 days ago [-]
Particularly in scaled display modes as well.
phkahler 1234 days ago [-]
A full screen at 4K is 8 megapixels or 32MB or so. I guess if they double or triple buffer each app might take 100MB or so, but you'd still have to open 10 apps to reach 1 GB.
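
(As a sanity check of that arithmetic, assuming 4 bytes per pixel:)

  # MB for one 3840x2160 BGRA frame buffer
  echo $(( 3840 * 2160 * 4 / 1024 / 1024 ))   # prints 31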
neurostimulant 1234 days ago [-]
Hmm, let me count my currently open windows at the moment.. 24 windows from 12 different apps.
bzzzt 1234 days ago [-]
10 app windows doesn't sound like a lot to me. Besides that, if you enable resolution scaling macOS renders at a higher internal resolution and lets the compositor scale it down. Try that on a 5k screen...
rsynnott 1234 days ago [-]
I mean, what of it? 10 windows doesn’t seem like many at all. I tend to end up with at least ten terminal windows by the end of the day, never mind everything else.
saagarjha 1234 days ago [-]
This is how WindowServer usually is.
snazz 1234 days ago [-]
Really? Huh. I’ve never seen it use anywhere near that much on my Hackintosh, but I don’t have a super high-res screen or anything like that. I wonder what the difference is caused by.
bobbylarrybobby 1234 days ago [-]
On my 2015 15” MacBook Pro, I have about seven apps open, about half with multiple windows, and an external monitor plugged in. WindowServer is using 200MB.
saagarjha 1234 days ago [-]
You're probably looking at RSIZE?
perardi 1234 days ago [-]
I am clocking in at 2.13GB on my 16-inch MacBook Pro.
aledalgrande 1234 days ago [-]
I thought a lot of stuff didn't work with Homebrew? https://github.com/Homebrew/brew/issues/7857

How do you run the command line tools through Rosetta once they are installed?

michaelbuckbee 1234 days ago [-]
The tip I saw was to set Terminal to run under Rosetta; the apps launched from it then all execute through Rosetta (Homebrew included).
steipete 1234 days ago [-]
You can use arch -x86_64 and install the Intel versions. Small perf overhead, but not really noticeable.
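
A sketch of both approaches, assuming Rosetta 2 is installed (/usr/local is where the Intel build of Homebrew lives, /opt/homebrew the ARM one; wget is just an example formula):

  # start an x86_64 shell under Rosetta 2
  arch -x86_64 /bin/zsh
  uname -m                           # now reports x86_64
  # anything launched from here, Homebrew included, runs as Intel
  /usr/local/bin/brew install wget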
aledalgrande 1234 days ago [-]
Does MacOS auto detect that they are x86 binaries and run them through Rosetta automatically after installing?
saagarjha 1234 days ago [-]
Yes.
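
To see which way a given binary will go, you can inspect its architecture; the paths below are just the conventional Homebrew prefixes, and wget is an arbitrary example:

  file /usr/local/bin/wget     # Mach-O 64-bit executable x86_64 -> Rosetta 2
  file /opt/homebrew/bin/wget  # Mach-O 64-bit executable arm64  -> native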
danieldk 1234 days ago [-]
Unless you use anything that does JIT compilation (e.g. the JVM); JetBrains IDEs, for example, are really slow under Rosetta.

Another pain point is code that relies heavily on SIMD. AVX/AVX2/AVX-512 instructions are not translated, so in those cases you'll be using slower kernels. My machine learning models are ~10 times slower with libtorch under Rosetta, though they are competitive when libtorch is compiled natively.

(Source: I have used an M1 Air for a week, but decided to return it.)

neurostimulant 1234 days ago [-]
Does this mean python performance is finally competitive with those fancy JIT languages now?
saagarjha 1234 days ago [-]
Nope.
andrekandre 1234 days ago [-]
> used an M1 Air for a week, but decided to return it.

out of curiosity, why did you return it?

dgdosen 1234 days ago [-]
I just have two versions of iTerm - one started via x86_64, and one started natively as arm64.

For now, I'm mostly using Rosetta 2 for running iTerm. It works great. I do find sluggishness on some initial builds, but then it's off to the races.

kaiwen1 1234 days ago [-]
I’ve been on a new M1 since the day after release. Love it. FAR faster than my previous Mac. But there is some work left to do. It crashes every day, usually more than once. Frustrating but still very happy to have it. Just bought a second one today for my wife.
mantap 1234 days ago [-]
What do you mean by 'crashes'?
sally1620 1233 days ago [-]
These first-generation Apple Silicon Macs are really publicly available DTKs for developers.

I think Apple is expecting all developers to jump and port their apps so they are ready when the high-end Apple Silicon devices hit the market.

devit 1234 days ago [-]
Are there CPU manuals from Apple that explain how to optimize assembly code for their CPU and document any deviations from the ARM reference architecture?
fiddlerwoaroof 1234 days ago [-]
saagarjha 1234 days ago [-]
Ah yes, another documentation page that Apple found and ruined by "modernizing"…
saagarjha 1234 days ago [-]
None that are available externally, no.
VHRanger 1234 days ago [-]
Serious question, since I only have a little experience writing assembly:

How would optimized M1 assembly differ from optimized ARM assembly in general?

I imagine good ARM assembly code on a Graviton2 or a Snapdragon chip would be similarly good on an M1 chip

saagarjha 1234 days ago [-]
There are always microarchitectural differences you can take advantage of, but Apple probably wants you to stick to the ISA as these things change in ways they don't want to document.

Apple of course uses these things in the lower levels of their software, so if you're using their APIs you're getting this for free across OSes and CPUs.

rsynnott 1234 days ago [-]
Micro-architectural differences. For instance, from the 386 to Pentium 3 (and on Athlon, I believe), shifting was cheap, due to the presence of a barrel shifter. This was dropped in the P4, and suddenly shifting was quite expensive (I think, though I’m not sure, that it may actually have come back in later P4s). That’s the sort of thing compiler writers need to know about.
userbinator 1234 days ago [-]
The P4 was quite an anomaly[1] in the history of x86, optimised for clock speed at the expense of many other things. All the subsequent generations went back to the norm, with the exception of the Atom series, and even that one is not as unusual as the P4.

[1] Even the family/model/stepping designation has an oddity: the 486 was family 4, the Pentium was family 5, and then everything from the Pentium II/III up to the latest Core series, as well as the Atoms, uses family 6; but the P4 was family 15.

cataphract 1234 days ago [-]
The costs of different operations differ between CPUs that implement the same instruction set. Compilers have cost tables for each processor family for when you target them (e.g. with -mtune).
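
For illustration with GCC/Clang on x86 (foo.c is a placeholder; flag availability varies by compiler version):

  gcc -O2 -mtune=skylake foo.c   # schedule for Skylake's cost tables
  gcc -O2 -march=skylake foo.c   # additionally allow Skylake-only instructions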
mhh__ 1234 days ago [-]
It really depends on the actual design of the processor.

Some things, like instruction density and inlining heuristics, translate from processor to processor, but the actual microarchitecture (rather than the ISA) determines what "good" code is from platform to platform; e.g. good instruction scheduling depends on how the CPU is designed.

skohan 1234 days ago [-]
Presumably a lot of the optimizations they use for their own compilers would be upstreamed into LLVM, no?
saagarjha 1234 days ago [-]
It seems like many have not for a number of years. A14 is still not even listed in LLVM's list of aarch64 targets :/
my123 1234 days ago [-]
It takes them a while. Apple A13 is there in the Apple LLVM target list at https://github.com/apple/llvm-project/blob/master/llvm/lib/T....

A14/M1 will come, but the question is when...

cozzyd 1234 days ago [-]
Very sad to hear, but unsurprising. That's what you get for not using a GPL compiler, I guess.
mhh__ 1234 days ago [-]
Indeed, although it's worth saying that GNU was offered LLVM and said no. They have suffered from their own success: new developers and the culture within programming are now post-GNU, and people can't remember the environment that led to the GPL becoming popular.
cozzyd 1234 days ago [-]
yes, and it's clearly something rms regrets:

https://lists.gnu.org/archive/html/emacs-devel/2015-02/msg00...

If rms had a sane way of reading e-mail, things might be different :).

amelius 1234 days ago [-]
Was this a fully parallel build (using all cores)?
peter303 1234 days ago [-]
Ah, beta testing new M1 software
sesuximo 1234 days ago [-]
I guess Rosetta can’t handle anything that JITs.

Edit: this is totally wrong. see thread.

ViralBShah 1234 days ago [-]
It has been reported that Julia runs just fine under Rosetta.

https://github.com/JuliaLang/julia/issues/36617#issuecomment...

sesuximo 1234 days ago [-]
Wow that’s pretty cool. I also found https://developer.apple.com/documentation/apple_silicon/abou... which explicitly says Rosetta can handle jit. I wonder how it does that? It must emulate the jitted x86 instructions...
sesuximo 1234 days ago [-]
Edit: apparently it translates instructions when they are marked executable (based on speculation on HN)
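
For what it's worth, that same Apple page also documents a sysctl for asking whether the current process is running translated, which is handy when poking at this:

  sysctl -n sysctl.proc_translated   # 1 = under Rosetta 2, 0 = native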
viktorcode 1234 days ago [-]
Ahead-of-time translation is merely an optimisation trick. It is still an emulator.
sesuximo 1234 days ago [-]
Is this really true? I thought Rosetta translated x86 to arm

Tried looking for docs but didn’t find anything decisive either way

The_Colonel 1234 days ago [-]
Yeah, I think better question is what's the performance penalty of JIT on Rosetta 2.
mlthoughts2018 1234 days ago [-]
What an odd article. It spends several paragraphs describing nightmarish compatibility issues and lack of support or capability for various mainstream tools, but seems to have a completely unjustified and wildly optimistic point of view that all the problems are just short term transitional annoyances that will be fixed in Q1 2021 or shortly thereafter. It concludes M1 is worth the hype.

What a bizarre perspective with nothing to support it except apparently blind optimism.

macintux 1234 days ago [-]
Well, there’s also Apple’s success rate at pulling off transitions like this, and the fact that their developer partners know they have to either make the transition or abandon the platform.

Unlike Microsoft dipping its toes into the ARM waters, everyone knows Apple is committed to this.

totalZero 1234 days ago [-]
Apple pulled off a similar transition once in recent history (the other examples are so antiquated that they teach us little about the present hardware industry), and that was a move toward the rest of the personal computing market, not away from it.

This is a step toward mobile/iOS. Apple isn't the same underdog that it was when PowerPC was put to bed, and the transition is challenging because the Mac platform is bifurcated in the interim.

PowerPC to Intel isn't a good proxy for this because the technology is different, the transition strategy is different, and the company itself is different.

If the M1 transition is to be smooth, it has to achieve that on its own merits, without looking back in search of comparable transitions past.

macintux 1234 days ago [-]
The complaints raised in this article are mostly irrelevant to the typical consumer.

Rosetta 2, from all reports, works well enough that I’d argue they’ve already succeeded in making the transition smooth enough to call it a success.

The main non-developer concern raised in this piece is extraneous dialog messages, which sadly we’ve pretty well all been trained to ignore. I agree they’re unfortunate, but they’re hardly a show-stopper.

The real risks Apple faces, as far as I can see:

* Will developers do the needful for universal apps? That seems to be well underway, even among the giants.

* Will a lack of x86-64 Windows emulation prove to be a deal-breaker for too many users? I’d say that’s impossible to know, but I’m optimistic. (And if Apple once again is setting a trend for the industry, Windows itself may fully make this transition someday.)

* Will Apple be able to keep up with AMD and Intel in the chip race? Obviously early signs are promising.

Big risks, but mostly ones Apple can control, or at least influence. What other dangers am I missing?

Adding: the dependence on TSMC is obviously a risk, but I imagine Apple has some notion of what they can do if they lose that option.

tosh 1234 days ago [-]
optimism might be justified: there are updates for apps coming out basically every day atm
Rapzid 1233 days ago [-]
I found that an odd but recurring theme.

* Virtualization not supported

* Gradle is extremely slow

* Webkit crashes

* No Docker yet

* Builds not as fast as their top-end Intel based MBP

* Builds not nearly as fast as their Hackintosh

* Can't wait to migrate CI to it

wut

woahAcademia 1234 days ago [-]
Dev here, the burden of Apple far exceeds any claimed time savings. I think every dev knows exactly what I'm talking about. And if you aren't a dev, Apple treats us like replaceable dogs.

Apple has killed any goodwill among nerdy developers by treating them like shit for 20 years. Sure, there are still corporate developers who are forced to make iOS apps, but it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...

Until Apple treats us(and maybe their non dev customers) better, I don't see any reason outside iOS Apps to ever buy a macbook for dev.

bnchrch 1234 days ago [-]
> it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...

Yeah, this is simply just false.

Apple being a consumer-grade POSIX OS is exactly why hobbyists choose it. It has first-tier support for every consumer app you need, with no configuration headaches, and has all the tools you want for modern development.

This is why most web devs today are still carrying Macbooks. Even if Apple has not been prioritizing building machines with beefy specs.

eyelidlessness 1234 days ago [-]
I’ve been doing web and server dev, and personal computing, on a Mac for those same last 20 years and I honestly can’t relate one bit to any of what you’re saying either as a dev or just an end user. In fact, the companies I feel most mistreated by are those who have pushed more and more cross platform (Electron & out of place UI/UX) onto the platform.

(A notable exception to that feeling is VSCode. While I would quite prefer a native app with its features, I’ll gladly pay the Electron tax because it’s the best editor I’ve ever used.)

Toutouxc 1234 days ago [-]
> but it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...

This may surprise you, but yes, it's what people seem to be doing.

jb1991 1234 days ago [-]
Almost all the full stack web developers I know are using MacBooks for both front and back end codebases across half a dozen languages. It’s an advanced, modern operating system with nearly complete Linux compatibility.

It’s fine if it’s not for you, but to stereotype an entire part of the industry like this isn’t fair.

burnthrow 1234 days ago [-]
As they say, to know where the ball is headed, watch the full stack web developers. I know it was a half dozen because I had to use both hands to count.
herrkanin 1234 days ago [-]
Dev here, as well.

I really don't know what you are talking about. All my development work could probably be made on a linux computer, or windows, but I choose to use Mac just because it makes embedded work and web development so much easier.

chongli 1234 days ago [-]
I've long been a Mac user but never done embedded development. Just curious. What makes the Mac better for embedded development?
jkelleyrtp 1234 days ago [-]
I’ve gone through dozens of tutorials for installing tool XYZ for Mac to develop with embedded, and not once have I had to install drivers or mess with udev rules. Things just work when you plug them in.
burnthrow 1234 days ago [-]
Maybe unavailability of workbenches is a catalyst for creativity, har har.
notreallytrue 1234 days ago [-]
Nothing.

Most of the toolchains are OSS

durandal1 1234 days ago [-]
Maybe most developers are professionals who just want to solve problems for their customers, on the platform their customers are using, instead of shipping second-grade experiences based only on unsubstantiated assumptions about what Apple will do in the future?
alphabettsy 1234 days ago [-]
This is certainly a hot take. Every event for “developers” I’ve been to recently Macs outnumber anything else by a significant margin.

I can say for me personally that yeah I have my issues with macOS, but the amount of fiddling I have to do with Linux and the different patterns I have to learn with Windows means macOS will likely continue to be my choice for years to come.

jonplackett 1234 days ago [-]
As a hobbyist developer I definitely use a MacBook and will definitely buy another one as soon as a 16-inch 32 GB MacBook comes out.
bzzzt 1234 days ago [-]
You can order it. Just not yet with an M1 CPU ;)
jonplackett 1233 days ago [-]
Yes, but that one is barely any better than my 15.4" MacBook Pro. Pointless upgrade.
rsynnott 1234 days ago [-]
I’m a dev; fled to MacOS in 2005 or so because Linux on a laptop was just such a nightmare, and never really looked back. I gather things are better now, but still not _great_, and life is too short to have to sacrifice a goat every time a kernel update breaks your Bluetooth audio or whatever.

Apple’s far from perfect, but there is something to be said for ‘just works’.

sliken 1234 days ago [-]
Heh, been running Linux for a decade plus and don't remember any such problems recently.

Amusingly, you mentioned Bluetooth: apparently the new M1s are having serious Bluetooth issues, so much so that those with the Apple Bluetooth mouse and keyboard are resorting to wired devices.

friedman23 1234 days ago [-]
> Dev here, the burden of Apple far exceeds any claimed time savings.

I have the complete opposite experience. I don't feel any burden from Apple when using a Mac. macOS being a Unix-like operating system makes developing on it a breeze. On Windows I need to use archaic developer tools like PowerShell. Linux commands I use on my production servers don't work in my development environment. If I want to use Git in a sane manner from the command line I need to install an emulator that has its own issues.

There is a reason almost all companies in Silicon Valley use MacBooks for their developers.

> but it's not like a hobbyist is going to buy an Apple computer for embedded, web dev, gaming, PC applications, etc...

As someone else has said, this is exactly what people seem to be doing.

oblio 1234 days ago [-]
> On Windows I need to use archaic developer tools like PowerShell. Linux commands I use on my production servers don't work in my development environment.

I mean, you can dislike Powershell and love Unix tools, but your choice of words is remarkably funny.

If anything, the Unix tools are archaic: generally organically grown rather than designed, or designed on principles from 50 years ago, in many cases principles that have since been superseded.

Powershell actually has a design and a modern one.

Again, you might like one and dislike the other, it depends a lot on personal preference and familiarity.

But... Poor choice of words :-)
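
To make the design point concrete, here is a purely illustrative pair of one-liners that list the top memory consumers; the Unix version scrapes text columns, while PowerShell pipes structured objects:

    # bash: parse the %MEM column out of ps's text output
    ps aux | sort -nrk 4 | head -5

    # PowerShell: sort real process objects by a property
    Get-Process | Sort-Object WS -Descending | Select-Object -First 5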

sidpatil 1234 days ago [-]
> On Windows I need to use archaic developer tools like PowerShell.

What makes PowerShell archaic, in your opinion?

friedman23 1234 days ago [-]
It's not bash. It's proprietary garbage. It's not what I'm going to be using on my production servers. Why does it exist? I have no clue.
sidpatil 1234 days ago [-]
I'm not sure what you're referring to specifically when you say "proprietary", but most of PowerShell is available under the MIT license.

https://github.com/PowerShell/PowerShell#legal-and-licensing

pjmlp 1234 days ago [-]
Hey, Apple's stuff is also proprietary garbage; why are you using it for your development work then?

A double standard?

rbrtl 1234 days ago [-]
Apple's terminal environment is POSIX-like. PowerShell was always a bit of a lame duck, or Microsoft wouldn't have bothered with WSL to get real power users to stick with their platform.
pjmlp 1234 days ago [-]
PowerShell is the only mainstream shell that comes close to the REPL experience of the Xerox PARC workstations.

As for WSL, you've got it all wrong. After the failure of Project Astoria (running Android apps on UWP), Microsoft saw a business opportunity in selling Windows to the folks who buy Apple devices to do GNU/Linux work, folks unhappy that Apple no longer cares about them now that it no longer needs their money, rather than supporting Linux OEMs.

So they picked up Astoria's infrastructure, reworked it as WSL, and started selling the feature to that crowd: now those people can get the hardware that Apple doesn't sell them, while continuing to pay proprietary garbage vendors instead of supporting the vendors in the Linux community.

Scarbutt 1234 days ago [-]
I don't use Windows, but in your case WSL 2 solves this problem better, since you can have the same real Linux that runs on your servers.
thekyle 1234 days ago [-]
I really wanted to like WSL 2 but it has a lot of bugs that may take years to get ironed out (unusably slow disk IO, memory leaks, virtual disks that grow forever, etc.)

I think it would probably work for web development as long as you keep everything inside WSL, but for other development tasks you're going to run into issues real quick.

https://github.com/microsoft/WSL/issues/873

https://github.com/microsoft/WSL/issues/4197

https://github.com/microsoft/WSL/issues/4166

https://github.com/microsoft/WSL/issues/4699

nickjj 1234 days ago [-]
I use WSL 2 for full time development on the stable release of Windows 10.

The last 2 issues are real as of today, but you can work around them, to the point where they become non-issues, by setting 1 config file value and running 1 command maybe once a month (see the sketch below).

Also if you keep your source code inside WSL 2's file system then the first 2 issues are non-issues in practice.
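
For reference, the workaround I mean is roughly the following; treat it as a sketch, since the distro package directory name varies per install and Optimize-VHD needs the Hyper-V PowerShell module:

    ; %UserProfile%\.wslconfig -- the one config value: cap the VM's memory
    [wsl2]
    memory=8GB

    # the roughly-monthly command (elevated PowerShell): stop WSL, then compact the disk
    wsl --shutdown
    Optimize-VHD -Path "$env:LOCALAPPDATA\Packages\<DistroPackage>\LocalState\ext4.vhdx" -Mode Full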

Cu3PO42 1234 days ago [-]
Anecdotally I've been doing development in C++, Rust, .NET and other languages just fine in WSL2.
JacobSuperslav 1234 days ago [-]
I've got a feeling we are not the target of their hardware.
Razengan 1234 days ago [-]
> I think every dev knows exactly what I'm talking about. And if you aren't a dev, Apple treats us like replaceable Dogs.

No. Stop presuming to speak for all of us. Apple has generally treated me a lot better than Microsoft did, as a user and as a developer.

Downvoting the other side may give the impression that everybody is of the same opinion in this echo chamber, but it won’t change the reality that many people are happy with Apple and excited for the direction they’re taking.

CyberRabbi 1234 days ago [-]
This is the computer experience I’ve waited my entire life for. I’ve been working on this thing for a week straight (I’ve literally been awake 24 hours a day for 7 days) and it’s still at 99% battery life.

Black. Magic. Fuckery.

andrewnicolalde 1234 days ago [-]
Something tells me your computer is plugged in ;)
vinteruggla 1234 days ago [-]
So is it true that the components might degrade faster on this machine because of heat, given that it has no fan?
banana_giraffe 1234 days ago [-]
I'd be less worried about components and more worried about a human being awake for a week straight.
CyberRabbi 1234 days ago [-]
No this thing literally emits zero heat
saagarjha 1234 days ago [-]
Meanwhile your comments seem to just be hot air ;)
CyberRabbi 1234 days ago [-]
If you had an M1 you’d understand
mhh__ 1234 days ago [-]
https://en.wikipedia.org/wiki/Landauer%27s_principle

There's a Nobel Prize in there somewhere, get cracking

CyberRabbi 1234 days ago [-]
Yes the M1 is revolutionary and Apple definitely deserves a Nobel for all the innovations they’ve done to deliver this industry-changing technology.
vaccinator 1234 days ago [-]
I'm starting to think that Steve Jobs was holding Apple back...
iammyIP 1234 days ago [-]
Calm down. CPU speed hasn't really mattered for 10 years, and if you aren't involved in the Apple-ecosystem hula hoop (or JavaScripting, god forbid), I think the M1 is not very interesting until it allows other OSes to run on the hardware freely.
fartcannon 1234 days ago [-]
All these developers are wasting time and money to help support a company that is actively working to lock down personal computers.

Think about a future where a new developer needs to pay a license fee and have their software reviewed just to show their friends.

eyelidlessness 1234 days ago [-]
Or these developers value different things than you and they’re spending their time how they want to benefit from those different values.
pydry 1234 days ago [-]
More likely they're simply not considering what the future will inevitably hold.
eyelidlessness 1234 days ago [-]
I’m sorry but this is just elitist nonsense. It’s disappointing to me that so many here simply can’t conceive that some people don’t share their priorities, and make conscious choices that reflect their own priorities.

I don't care if the inevitable future of the Mac platform is more locked down. In fact I welcome it. Does it limit what I can do with my device? Of course. Do I lament some of the control I’ve already given up? Sure. But the trade off is a safer and smoother computing experience that allows me to focus more on what I want to achieve with what the platform does provide.

I spent some time on Linux, I got into hyper-customizing every minute detail, I swapped out third party RAM and even early adopted SSDs by mounting them in the optical drive bay. I built a PC tower from personally selected components. I triple booted Windows and Linux on my first Intel Mac with community built boot loaders well before Boot Camp was a thing.

I can’t imagine spending my time that way anymore. I learned a lot, but now I just wanna get stuff done. I like having some of the system out of reach. I like knowing that it’s readonly and I like the kind of hardware advances Apple’s vertical integration has enabled.

It’s okay if we value different things, and it’s okay if we make different compromises to experience them.

_dibly 1234 days ago [-]
I don't understand: so you don't care, and it's okay if he cares, but if he says that people not caring is part of the problem, then he's an elitist? What is your argument here, that he has no grounds to point out the apathy toward the long-term impact of your decisions?

You go so far as to validate the things he's concerned about and reiterate that you just don't care about them. I don't see what's wrong with pointing out the inherent shortsightedness of that line of thinking.

>it’s disappointing to me that so many here simply can’t conceive that some people don’t share their priorities, and make conscious choices that reflect their own priorities.

Have you considered that those people understand basic human psychology and that they take issue with the exact priorities that they are speaking against?

eyelidlessness 1234 days ago [-]
> I don't understand: so you don't care, and it's okay if he cares, but if he says that people not caring is part of the problem, then he's an elitist? What is your argument here, that he has no grounds to point out the apathy toward the long-term impact of your decisions?

What problem? How do my decisions to prefer one computer platform have any impact on GP or you or anyone? Apple turning the entire Mac line into iPads can’t and won’t eliminate more open platforms that you or GP may prefer. There’s no future inflection point where Apple has locked down the Mac platform in a way that enables them to storm into your home and replace your Linux computer (or whatever) with a Mac.

It’s elitist because the suggestion is that my preference represents an inability to evaluate the impact it has on me. It has no meaningful impact on you. It would have no meaningful impact on you if I decided to eschew all computing technology and go live in a monastery.

> You go so far as to validate the things he's concerned about and reiterate that you just don't care about them. I don't see what's wrong with pointing out the inherent shortsightedness of that line of thinking.

I am allowed to have different preferences and priorities. It’s not shortsighted, I see the compromise I’m making and I’m satisfied with what it provides in return. And since it in no way harms you or GP, it’s not your business to tell me to change my preferences.

> Have you considered that those people understand basic human psychology and that they take issue with the exact priorities that they are speaking against?

Have you considered that taking issue with other people’s preferences and imposing your own is far more controlling than just accepting that some people like different things, even if their preferences are personally limiting in a way you’re not comfortable with for yourself?

marcinzm 1234 days ago [-]
Why is it so difficult to consider that your risk-vs-reward calculations are neither universal nor inherently correct? They likely did the math and found that the risk is worth the reward to them. Actual money in the hand now is worth more than theoretical money in the future and so on.
_dibly 1234 days ago [-]
I think it's funny that you accuse someone of not considering the alternatives with the supporting argument that 'they probably considered the alternatives'. What makes you think a for-profit company is thinking about the long-term negative impacts of their business on the industry when that's rarely the case in a free market?

>Actual money in the hand now is worth more than theoretical money in the future and so on.

Exactly, short-term profit will almost always trump long-term problems. Is that so difficult to consider?

marcinzm 1234 days ago [-]
This is a discussion regarding third party developers who use the Apple platform. I honestly have no clue how your statements tie into that in any way.
_dibly 1234 days ago [-]
Third party developers are typically for-profit, and your argument was that those developers are capable of factoring in the long-term impacts of their decisions over the short-term profits when in reality that is very frequently not the case.

>They likely did the math and found that the risk is worth the reward to them

GP is saying that they likely did not do the math or that they prioritized short-term gains over the long-term impacts of their decisions as for-profit entities typically do.

I'm not sure what there was to misunderstand about my original comment since I more or less said exactly that.

marcinzm 1234 days ago [-]
>GP is saying that they likely did not do the math or that they prioritized short-term gains over the long-term impacts of their decisions as for-profit entities typically do.

No, GP said they did not consider future impact, period. I said that entities can consider future impact and still choose short-term profit (and that it is often rational to do so). I then pointed out that assuming someone else didn't do the math because their results differ from yours is egotistical.

_dibly 1234 days ago [-]
Those things aren't mutually exclusive.
db579 1234 days ago [-]
That's even worse. Accidentally disagreeing with my values would be one thing but doing it on purpose... beyond the pale!
viktorcode 1234 days ago [-]
I think they aren't wasting time and money. Look at the article's author -- they are investing time to earn money selling their product on this platform.

Also, your point about the Mac's future is pure speculation at this point, and it contradicts public statements from Apple.

_dibly 1234 days ago [-]
Apple contradicts public statements from Apple pretty consistently; they have a habit of saying they won't ever do something right up until they do it. I don't think there's anything unreasonable about those types of concerns, because when/if Apple does go back on their public statements you won't have much recourse.
pjmlp 1234 days ago [-]
I used to live in that past, and you know what, it was great to be able to sell development tools to developers instead of being forced to work only with enterprise customers, the only ones left willing to pay for tools.
mhh__ 1234 days ago [-]
Who will think of Apple's revenues in these trying times?
pjmlp 1234 days ago [-]
Everyone gets to pay for their tools in some form, even if it means getting them nth-hand out of some flea market. It's just that in some software development circles, for some strange reason, we get a bunch of people who feel entitled to earn money without paying others for their work.
maxpert 1234 days ago [-]
With this argument iOS should have never taken off. Yet it's better than "open" Android standards. The thing is, Apple rewired people to stop thinking about RAM and CPUs and made developers write software that works on a 4-year-old iPhone. Try that on Android or Windows. Yes, there is a downside to it, but as long as consumers win with the outcome, I personally think it's fine.
yuy910616 1234 days ago [-]
I think the value of app stores is going to continue to diminish as web apps gain traction. Look at web apps like Figma; they're quite powerful.
heavyset_go 1234 days ago [-]
Apple purposely doesn't implement standards that would make progressive web apps viable on their platforms, and they don't allow third-party browser engines on their mobile platforms.
rimliu 1234 days ago [-]
Can we retire this already? I have been hearing it for ten years at least. Or will it stick around like the "year of the Linux desktop"?
Razengan 1234 days ago [-]
It’s no different than consoles, and many users and developers are okay with it. For those who are not, there are other options.

Why is that so difficult?

boudewijnrempt 1234 days ago [-]
"there are other options."

Until the day when there aren't other options -- which isn't so difficult to understand, really.

acdc4life 1234 days ago [-]
Slippery slope, with all evidence pointing to the contrary. Apple will never have a monopoly; Linux will always be an alternative.
makapuf 1234 days ago [-]
Not on Apple hardware, however.
rbrtl 1234 days ago [-]
So... don't buy it? Unless you think it's the best hardware on the market at the best price point.

And what makes you so important that every hardware provider has to bend to your will? I can't run Shor's Algorithm on my Raspberry Pi, whose fault is that?

ulisesrmzroche 1234 days ago [-]
Then why don't you rail on Linux? If one day there are no other options left, why not aim that way? Presumably that's a worse betrayal.
rbrtl 1234 days ago [-]
How will Apple take away your options? That's what's difficult to understand. Either the FOSS/DIY approach has merit and will continue to thrive, or it'll die because it becomes impractical; neither of those courses has anything to do with Apple trying to prevent Joe Bloggs from becoming a node in a botnet.

EDIT:

> isn't so difficult to understand, really

You may be confusing "understand" with "imagine"

Razengan 1234 days ago [-]
Alright. How will there be no other options one day?
Razengan 1234 days ago [-]
As expected, no answer.
kevingadd 1234 days ago [-]
"It's no different than consoles" is a pretty strong statement to make unsupported. There are many differences. You can make a more direct comparison between say, an iPhone and a game console, but a general-purpose laptop or desktop has very many differences from a game console, and ought to.
riotman 1234 days ago [-]
> "It's no different than consoles" is a pretty strong statement to make unsupported. There are many differences.

Like what? The PS5/Xbox Series X/PS4/Xbox One are very sophisticated, comparable to a modern desktop PC, yet they're totally closed off. Consoles no longer resemble embedded devices the way previous gens did. Heck, even the original NES used a derivative of the MOS 6502, a chip that was very popular in PCs in the early days.

mhh__ 1234 days ago [-]
Consoles exist as an exception rather than the rule. You should be able to run whatever you want on what is basically an x86/RDNA2 gaming PC.

The openness of the PC should be enforced legally across anything that can be reasonably construed to be one.

Razengan 1234 days ago [-]
> The openness of the PC should be enforced legally

Oh fuck no.

If you want a PC you can get a PC. Stop trying to force your mediocrity-for-all BS on those who don’t want PCs.

cma 1234 days ago [-]
Interesting that the Java "write once, run anywhere" stuff ends up being the slowest, due to having to run through emulation when it interfaces with native code (native compiled code: fast; Java JIT for the right arch: fast; Java JIT for the wrong arch due to old native integrations: super slow).
saagarjha 1234 days ago [-]
A server JDK running under Rosetta is actually not too bad, performance-wise.
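
For anyone wanting to try it, forcing the x86_64 path from the terminal is simple (assuming an Intel JDK build is installed; both commands exist on Big Sur for Apple Silicon):

    # one-time Rosetta 2 install
    softwareupdate --install-rosetta --agree-to-license

    # run the x86_64 binary under translation
    arch -x86_64 java -version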
dukeofdoom 1234 days ago [-]
Thinking of buying the MacBook Air: will my Python 2.6 and Django project run on this?
ehutch79 1234 days ago [-]
You should really think about porting to Python 3. It's not actually that bad unless you heavily use something that changed. I ported a major Django project last year and it wasn't a major problem or time sink.
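
A sketch of the mechanical part, with myproject/ standing in for your app directory (2to3 ships with Python 3; it only automates the syntactic changes, so lean on your test suite afterwards):

    # preview the changes 2to3 would make
    2to3 myproject/

    # apply them in place, keeping .bak backups
    2to3 -w myproject/

    # then exercise everything under Python 3
    python3 manage.py test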
CharlesW 1234 days ago [-]
A Google search suggests "yes".

It's worth noting that Apple has a 14-day return policy, and on the rare occasions I've used it, it's been hassle-free.

woahAcademia 1234 days ago [-]
Depends on your libraries.

Also, I'd highly recommend coming up with a specification for the computer you want.

Don't just buy brands because you've seen them on TV.

Darmody 1234 days ago [-]
It saddens me to see devs locking themselves in Apple's golden jail.
throwaway4good 1234 days ago [-]
Most developers I know already use a Mac. It is a faster system on similar hardware (maybe due to the filesystem or something), and its Unix insides support development work better than Windows does.

The M1 is just going to make Macs even more dominant.

And AMD is not going to rescue Windows/Intel. These night and day improvements are simply too big to be explained by TSMC being a better fab than Intel.

hu3 1234 days ago [-]
https://insights.stackoverflow.com/survey/2020#technology-pl...

- Linux 55%

- Windows 53%

...

- MacOS 24%

"Linux and Windows maintain the top spots for most popular platforms, with over half of the respondents reporting that they have done development work with them this year. We also see some year over year growth in the popularity of container technologies such as Docker and Kubernetes."

scq 1234 days ago [-]
That's for target OSes - developer OSes look like this: https://insights.stackoverflow.com/survey/2020#technology-de...

- Windows 45.8%

- macOS 27.5%

- Linux 26.6%

But still, macOS users represent only around a quarter of developers.

samatman 1234 days ago [-]
Which means we're looking at the question with the wrong granularity.

There are sectors of the industry where Windows is a hard requirement, ones where Macs are utterly dominant, and others where Linux is assumed and running a proprietary Apple OS gets you the side-eye.

Software development is a big tent.

hu3 1234 days ago [-]
> Windows is a hard requirement, ones where Macs are utterly dominant

I think you got it reversed. One can develop Android apps on macOS because there's no lock-in like Xcode. The same is not true for iOS apps, which require Xcode.

So if anything macOS is a hard requirement.

samatman 1234 days ago [-]
I don't have it reversed at all, mobile applications are just one sector of software development, and not a very large one.

There are entire industries (healthcare, to name but one), where Windows is simply a given, and all development targets that platform.

It's true that Macs are a requirement for iOS development, I was referring more to the observed fact that most devs in SF doing "cloud" whatever use a Mac, probably around 80%.

hu3 1234 days ago [-]
> SF doing "cloud" whatever use a Mac, probably around 80%.

A specific use case in a specific place on the globe is your definition of "dominant"?

I knew HN was a bubble but this is hilarious.

Imagine considering macOS dominant because it's trendy in SF.

samatman 1233 days ago [-]
No.

It's my definition of a "sector", which you would know, if you had read my words with more care and sympathy.

hu3 1233 days ago [-]
The whole thread is about developer marketshare. This is how it started: "Most developers I know use already a mac."

I'm sorry but your whole "granularity" tangent makes no sense.

wayneftw 1233 days ago [-]
> most devs in SF doing "cloud" whatever use a Mac, probably around 80%.

They're just victims of group think, advertising, peer pressure or status signalling. Everywhere else in the world, 80% of people doing "cloud" whatever aren't using a Mac.

hu3 1234 days ago [-]
Ahh thanks for clarifying. Somehow I missed that section which is more pertinent to the topic.
damnyou 1234 days ago [-]
Most developers I know use desktop Linux, and M1 is not (yet) an option for them.
throwaway4good 1234 days ago [-]
Maybe I should clarify the "most developers I know".

Inside my little circle of freelance Java/JavaScript developers, 2/3 are on a Mac. People here buy their own work machines, unlike when working for a big organisation where that is provided for you.

lmedinas 1234 days ago [-]
I don't get this blog, nor its point.

The author lists several bugs, but instead of posting the issue numbers with links, he posts his tweet with the bug numbers (without links). I'm not sure what the author is trying to accomplish here. Also, I guess these kinds of bugs, while not nice, exist mostly because macOS has just undergone the biggest change in its history, and supporting two completely different architectures is not an easy job. Of course we are at the mercy of Apple to fix them, but that is part of the deal when you buy such a machine at this point.

The remaining points just show the same known issue: some commercial software has not yet been ported to the new architecture.

Shivetya 1234 days ago [-]
Software is going to be a big sticking point for this platform, and outside of professional software it may be more difficult to bring other types of software along.

Apple cedes a lot of sales by not attempting to bring the gaming companies into their fold. Looking at Steam alone shows the disparity, and there is a lot of money to be had. The idea of just owning two systems is one that not everyone can justify.

viktorcode 1234 days ago [-]
Apple runs the biggest gaming store (in terms of revenue).
worldmerge 1234 days ago [-]
Maybe it's because I've only focused on graphics and gaming PCs, but it's really interesting to see how low-spec some of the work machines in this thread are; maybe I should write more efficient code. My desktop has 32 GB of DDR4 RAM and my laptop 16 GB. When I did controlmylights.net I was using almost all of the 32 GB and maxing out the GPU/CPU (2x resource-heavy OBS stream setups for Twitch/YouTube, MongoDB, Redis, a NodeJS application, openFrameworks). I don't normally run that heavy a production on my system, but it was nice to be able to. My desktop raised the temperature of the room by at least 5 degrees F; it was a mini space heater. I definitely could have built that project on something lighter, maybe using Rust, but it's nice to have the headroom. The next performance upgrade for me will be overclocking my 8700k.
pcr910303 1234 days ago [-]
Wow, HN is so harsh on Apple. I'm already seeing 4-5 top-level comments on how Apple "locks down" computers and is "taking our freedom"...

Apple has stated multiple times that they have no intention of locking macOS down any further. I can't really think why anyone would think Apple would lock down macOS; there's really no reason, right?

My gut feeling is that there are a lot of people who don't like Apple, mostly due to its proprietary nature, and they just... argue against Apple. Before the M1 appeared, the argument was that Apple's Macs are expensive for nothing, they have terrible hardware, the Touch Bar is bad, software quality has declined, etc... and now it's all about the user's freedom.

Seriously, people. This article is about using dev tools on the M1 Mac. Let's not start arguing about how Apple is bad for freedom, etc.

politelemon 1234 days ago [-]
It's quite the opposite, actually. HN has a ridiculous Apple bias (the flood of blogspam mentioning the M1 should be evidence of this), and any comments critical of it are shouted down. In most eyes here, they can do literally nothing wrong.

NO company can, or should, be above criticism. You will see critical comments about _all_ large companies in HN threads, but only defensive comments in these threads. Imagine a comment like yours on a thread about Amazon or Facebook.

I have a gut feeling of my own: that many people identify with their Apple products (identity politics) and are personally offended by any criticism that the organisation receives.

--

Edit: To put it in a little perspective... I invite you to view the contemporary Amazon thread. What would happen if you posted that everyone should focus on AMZN creating jobs and hiring people rather than criticizing them?

https://news.ycombinator.com/item?id=25237552

ImaCake 1234 days ago [-]
Maybe HN just has a diversity of opinion on Apple products? There is plenty of reason to critique Apple and plenty of reason to praise their products. I come to HN for the insight, and it is hard to get more insight about a tech company than from a bunch of tech nerds who disagree about it!
totalZero 1234 days ago [-]
I actually wonder about that.

M1 has been out for a short time, and Apple Stores are not accepting walk-in customers. People aren't seeing their friends much due to the pandemic, so it's unlikely that those who bought the new machines are broadly showing them off.

Lots of forum praise about M1 seems like it's coming from people who have no hands-on experience with the new devices. I have no reason to believe that Apple pays people to stir up hype on the web, but I definitely sense that the vast majority of people hyping M1 haven't purchased an M1 device. My sense is that they just read some blogs, ogle some benchmarks, and regurgitate what they see.

That's not exactly a diversity of opinion. I can tell you I've been voted down a couple of times just for suggesting that there's too much hype and not enough information.

ImaCake 1234 days ago [-]
I've seen some blogs by techy people with hands-on experience saying that they like the machine. I haven't really seen anyone who has it say they dislike it. But they do seem willing to jump through extra hoops to make the machine work for them.

So I think there is some reliable evidence in its favour. But I don't think you should be downvoted for disagreeing with that, since it's definitely up for debate! Someone seems to have downvoted your comment here, and I think that is the wrong action to take on what is clearly a thoughtful response on your part.

kergonath 1234 days ago [-]
There is some genuine excitement about a new CPU architecture, that’s quite natural. If anything the opposite would be surprising on HN. Some of these people are Apple users, some of them are hoping another vendor will follow and offer powerful ARM laptops and desktops, some of them are just happy that a breakthrough has been made and that it will stimulate the competition. There is quite a lot of diversity in these opinions, actually.

You even find the usual contrarians who moan that it’s Apple, so anyone saying anything positive have to be shills and the CPU has to be terrible (plenty of those in this thread).

Since when does one have to buy something to have an opinion? Are you saying that the people who complain about the M1 (or Macs in general) without having bought one should just shut up?

totalZero 1233 days ago [-]
> Since when does one have to buy something to have an opinion?

It's pretty difficult to have an informed opinion about a product that you've never seen/used/tested/tried.

Due to the requirements for a shopping appointment at an Apple Store (hard to come by, if you check online), and due to social distancing, it's fairly unlikely that a person who hasn't bought/received an M1 machine can have an informed opinion about it -- especially so soon after the release.

That's why I take the hype regurgitation with a grain of salt. Has nothing to do with buying the right to have an opinion.

granzymes 1234 days ago [-]
"The universal experience is that Hacker News is incredibly biased against whatever you happen to favor."
amelius 1234 days ago [-]
Another possibility: it is hard to make someone understand something if their income depends on them not understanding it.
amelius 1234 days ago [-]
Apple dictating what tools we can and cannot use (e.g. Firefox not being allowed to use its own rendering engine on iOS, being forced to use the system WebKit internally).

This should be enough to make you look elsewhere.

But then there is the locked bootloader, the Apple tax, etc. etc.

By supporting Apple, things will only get worse.

mlindner 1234 days ago [-]
All the things you state are only true for their phones, not their computers. The bootloader isn't locked at all. And as for the "Apple tax": it's largely a myth. If you look at the individual components used, they tend to be top-end binned parts across the board. There is some Apple markup, but it's not nearly as much as others claim. You can't get a PC laptop with the same mix of low weight, high performance, and longevity.
saagarjha 1234 days ago [-]
> I can't really think why anyone would think Apple would lock down macOS.

Because they keep introducing new security features that get progressively more annoying to disable. Have you looked at how difficult it is to modify system files these days?

rbrtl 1234 days ago [-]
Security is the antithesis of convenience. Would you rather they send every unit out with Frontier Law levels of user protections? Every graphics artist, musician, businessperson, student, and Joe Bloggs who buys a Mac needs to study the implications of owning a computer for themselves and mitigate all the threats on their own. Suddenly Apple computers are infected with malware like Windows was in the 2000s and we just burn all the research Apple did on the right compromise for their entire user base because a few power users want to turn on a machine and debug system processes before they even log in.

I perceive a level of anti-Apple bias in HN threads, I don't think it's imagined, but more prevalent is the power user bias of people who have advanced knowledge of computer systems and want a superuser machine off the shelf.

To me, the fact that you call them "security features" and not some pejorative euphemism is a bit of a capitulation.

saagarjha 1234 days ago [-]
Some of the things Apple makes are certainly "security features" and not security features, but to your main point: Apple designed their security model as "trust us or trust nobody", while the security model most users want is "I trust me". It is eminently possible to provide that without going back to the wild west of having no security at all; the issue is that Apple forces you to go there if you aren't happy with anything they add to macOS.
rbrtl 1234 days ago [-]
Most power users, maybe. Most consumer users don't think about the fact that they have to trust anyone at all. And you can't say you don't want the security features and then complain that you have to turn them off if you don't want them. The goalposts are Apple's customers; my guesstimate is that 80% of those paying customers would rather be secured by the vendor they already trusted to produce the hardware, firmware, software, and infrastructure they use on a daily basis. Do you know of any market research about the level of security most users want?
chongli 1234 days ago [-]
No, most users would rather trust Apple. These are the people who drop their computer off at the Genius Bar for software problems. They never even think about rolling up their sleeves to fix a problem, never mind firing up the terminal.
kergonath 1234 days ago [-]
You have to be realistic: you already trust them if you buy their hardware.
gjsman-1000 1234 days ago [-]
`sudo spctl --master-disable` to disable Gatekeeper, and `csrutil disable` in Recovery Mode to disable rootless/System Integrity Protection. Just like that, it's a Mac from a decade ago in openness.
saagarjha 1234 days ago [-]
Neither of those help with the thing I mentioned.
jsz0 1234 days ago [-]
Almost all the security features are getting progressively less sensible to disable, though. You've got to be a real Jedi to be using a computer with easily modifiable system files these days.
ehutch79 1234 days ago [-]
Why specifically, are you trying to modify system files?
saagarjha 1234 days ago [-]
I run beta software on my devices. Apple, in its infinite wisdom, believes that programs compiled on beta operating systems do not belong on the App Store. Thus, I modify SystemVersion.plist so Xcode will think it's running the latest release version rather than the beta.

This is just one of the things that you can do; I've occasionally patched system libraries, replaced resources, and so on depending on what I've wanted to do.
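
For the curious, the Catalina-era version of that edit was a one-liner along these lines (the version string is hypothetical, it requires SIP disabled, and Big Sur's sealed system volume is exactly what makes it painful now):

    sudo plutil -replace ProductVersion -string "11.0.1" \
      /System/Library/CoreServices/SystemVersion.plist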

ehutch79 1234 days ago [-]
Do you think it's really that odd that they wouldn't want someone publishing releases from beta OS/Libraries?

Given that everything else you talked about can be turned off, do you think that would be a better default: to allow system libraries and such to be altered, with the protection needing to be explicitly turned on instead?

saagarjha 1234 days ago [-]
> Do you think it's really that odd that they wouldn't want someone publishing releases from beta OS/Libraries?

Yes, very. It's not like the toolchain really cares which version of XNU it's running on.

> Given that everything else you talked about can be turned off, do you think that would be a better default: to allow system libraries and such to be altered, with the protection needing to be explicitly turned on instead?

I liked what they did in Catalina, where you could remount the rootfs as writable. What they have in Big Sur is too much; you have to restart every time you want to modify anything and it is not possible to change anything after booting.
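
For reference, that Catalina remount (with SIP disabled) was just:

    sudo mount -uw /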

_dibly 1234 days ago [-]
I don't see how that's any different from Microsoft, Amazon, or any other major tech company that makes the front page of HN.

In fact the only difference I can think of is that there's always some degree of reactionary discourse when Apple is mentioned by people who feel the need to defend the company personally. You never see this type of comment on a thread full of people complaining about Windows or AWS, but you almost always see it when there are a few comments critical of Apple.

sgerenser 1234 days ago [-]
If you really want to see “reactionary discourse” check out any article critical of Tesla.
mlindner 1234 days ago [-]
Historically, the attacks against Tesla have been matched in ferocity only by political attack ads, so a lot of Tesla fans get very defensive about the company.
rbrtl 1234 days ago [-]
I think the reactionaries in any of these conversations are the ones advocating that Apple not improve security, i.e. that they don't make progress in protecting their users.

There are many reactive voices speaking out in support of Apple, but they are responding to those who complain (for the most part) from the outside, or on their way out of Apple's door.

_dibly 1234 days ago [-]
I specifically mean reacting to the criticism Apple is getting. You see the same level of criticism of any major tech company in the comments section of the front page but very rarely are those criticisms followed up by a bunch of users defending the tech company in the comment section. You're arguing a weird semantic that doesn't really have anything to do with what I'm saying.
rbrtl 1234 days ago [-]
A cursory scan of the front page article concerning Amazon turns up a few standing in Amazon’s defence. I guess Apple isn’t the only tech giant with blindly loyal fans

https://news.ycombinator.com/item?id=25240953

https://news.ycombinator.com/item?id=25237993

https://news.ycombinator.com/item?id=25237932

https://news.ycombinator.com/item?id=25237628

_dibly 1233 days ago [-]
None of these comments are reacting to people complaining about Amazon in the comments or claiming that HN is unfairly biased against Amazon, which is what the comment I replied to was about.
rbrtl 1233 days ago [-]
Very good. You got me. Here you go:

https://news.ycombinator.com/item?id=9583362

https://news.ycombinator.com/item?id=25238542

https://news.ycombinator.com/item?id=6420048

https://news.ycombinator.com/item?id=23655400

Some of these are the replies to the claims of bias, some are the original claims themselves. The search bar is at the bottom of the home page. I may have been rather childish throughout this thread, it's a defence mechanism, but it doesn't mean I'm wrong, and it doesn't mean there isn't pro-Apple bias on HN. That doesn't mean there isn't pro-<FAANG> bias around here either.

People are entitled to these opinions, and to express them, as are you and I to participate in whatever you call this exchange. It's why I still love the internet even though it feels like it's raising my blood pressure from time to time. And why I go through spits and spats of spurning all social media and then participating in controversial discussions like this and other philosophical debates.

rbrtl 1233 days ago [-]
Would you happen to know any data science? I took a course on R but I never got to use it, but this is actually starting to sound like an interesting project. There's a lot of data entry level work to do I guess. My data chops aren't great but I can handle repetitive tasks.
rbrtl 1234 days ago [-]
Yeah, that’s reactive rather than reactionary. It’s a significant distinction, particularly as "reactionary" tends to be used pejoratively to describe the proverbial stick in the mud.

And my point was that the very fact that Apple is the one with defenders, among the technologists who tend to make up HN commenters, should make one wonder whether they aren’t doing some of those things right, not that the supporting commenters are mouth-breathing dullards who just "don’t get it". Apple is getting bad press right now over its lobbying on a US bill designed to eliminate slavery from supply chains, and I will join the voices calling on them to take responsibility for that. But in reference to how they control their hardware-software ecosystem for their customers, the reactionary point of view is that they shouldn’t be progressing towards a user-safe environment.

rbrtl 1234 days ago [-]
Well clearly I’m losing this debate to people who can’t even be bothered to use the words that actually mean what they’re trying to say. Snore
rbrtl 1234 days ago [-]
Cool. More cowardly downvotes. My respect for the users of this platform is plummeting. I get that I’m being snarky, but it’s just polite to tell people why their comment deserves a downvote. Am I wrong? Have I missed the point? Or have I butthurt a tender ego that can’t stand a minor correction on a trivial point in an online argument.

I’m out

_dibly 1233 days ago [-]
Maybe don't get into an argument about semantics when it's pretty clear what the person you replied to originally meant. Half the comments in this thread are reacting to the negative comments about Apple in this thread. Typically for other big tech companies you don't see that because there's a huge bias towards Apple on HN. Idk why you went totally off the rails because of "cowardly" downvotes but I assume that you received them because you were very clearly misconstruing my statement.
rbrtl 1233 days ago [-]
Maybe don't use big words you completely misunderstand in a discussion concerning a complex topic littered with political commentary and obvious bias. Particularly if you are only pointing out the obvious bias.
_dibly 1233 days ago [-]
Sorry you're so salty about downvotes that you're resorting to ad hominem and debating semantics instead of discussion, but you clearly don't have anything more productive to say so enjoy your well-founded sense of intellectual superiority I guess.

>Particularly if you are only pointing out the obvious bias.

Sure, let's pretend you didn't take my original comment completely differently than I meant it and then act like I'm too dumb to understand what I was talking about. Have a nice day.

rbrtl 1233 days ago [-]
How many times can I say "you were wrong" before you understand that I mean you used the wrong word to make a bad point that isn't true? Then you jumped down my throat about my pointing that out, and deflected the criticism to my minor point that you swapped the two terms.

I’m ready for bed, I had a nice day thanks. I took a small walk, learned a couple of songs and now I’m ignoring the TV while I sketch out designs for a web service, but your deliberate continuing obtuseness is keeping me entertained for the evening.

Edit (upon reflection): I'm salty about the downvotes because I don't yet have the karma to downvote comments myself, so I can only reply expressing my negative opinion. This so often results in further downvoting that I get further away from the privilege of a silent dismissal of comments with which I disagree.

theSIRius 1234 days ago [-]
> Apple has stated multiple times that they have no intention of locking macOS down any further

What Apple says and what Apple does can be two different things. That applies to any company in the world. Blindly believing them is naive at best.

> I can't really think why anyone would think Apple would lock down macOS

I can think of one: iOS. Locking down the software and hardware allows Apple to funnel people to their own stores and services. It is not rocket science to figure out that a company is trying to increase its revenue. Right now you have only macOS as the operating system; Linux may never come, and Windows support is anyone's guess right now.

xenadu02 1234 days ago [-]
The easy path on M1 would have been to lock everything down. Use the same iBoot phones use, the same kernel configuration, and call it a day.

It takes a lot of engineering work to make more permissive security modes work. To re-architect booting. To make external drive booting work. To make developing kernel extensions on production systems work. To allow signing your own boot blob. Work that 99% of users don't even understand let alone care about.

If that isn't enough evidence then nothing ever will be.

fsociety 1234 days ago [-]
Yep, this is true of all big tech companies in general. It's a bit like when punk rock went mainstream, except with tech companies (or rather, that was about 10 years ago). People hate change.

These new CPUs will represent a huge generational leap and dropping of old baggage. A laptop which has nearly 20h battery life, is small/portable, and is still fast? Wow.

Anything Apple delivers on for these CPUs will find its way into Linux and Windows machines.

This is good for everyone. Eventually you'll see the next generation of Lenovo's X series or Dell's XPS series with the same generational improvements.

breakfastduck 1234 days ago [-]
Yeah it's unusual.

Generally it's people who aren't even invested in the ecosystem anyway.

People have said this stuff for years; I can't do anything less on Big Sur than I've been able to do on any Mac.

vbezhenar 1234 days ago [-]
One example of Apple locking down macOS is how they're phasing out custom kernel extensions. For example, firewalls used to be kernel extensions; now they use explicit APIs and no longer run in kernel space. One could say that this allows for a more stable operating system. But on the other hand, macOS now bypasses those firewalls for its own services, and that exemption has already been abused to sneak ordinary programs' traffic past the firewall.
qppo 1234 days ago [-]
The margins on the App Store for iOS are reason enough to lock the OS down as far as they can. The difference is that they can only practically lock it down to the point where app developers aren't impeded.

But all that said, given the quality of developer tools from Apple, I don't have much faith that they won't impede everything but Swift development in Xcode.

Hamuko 1234 days ago [-]
>Apple has stated multiple times that they have no intention of locking macOS down any further

Didn't they also say that notarization was about security?

tambourine_man 1234 days ago [-]
It is
Hamuko 1234 days ago [-]
Then maybe they shouldn't try to prevent Epic Games from doing it.
qz2 1234 days ago [-]
I have a different perspective: I now do all my development remotely in cloud VMs when I can. I know that doesn’t apply to some classes of development, but it’s a hell of a lot less painful isolating it and having snapshots available for when you inevitably break something. The amount of hours I’ve lost to an OS upgrade and a dev toolchain upgrade breaking each other is huge, so I keep them well apart.

Thus I’ve been using an 8 GB M1 Mac mini this way for a week now and haven’t had any problems. It makes a very nice terminal computer with a luxury desktop. When a suitable roughly-27-inch iMac with an M-series CPU appears, I will buy one. It’s nice being able to triple-swipe transparently between a Mac desktop with native apps, a desktop full of terminals into Linux VMs, and a Windows desktop machine via RDP!

burnthrow 1234 days ago [-]
Your perspective "I use all machines as thin clients" is kind of pointless in a hardware discussion.
qz2 1234 days ago [-]
It’s not, really, because the hardware is quieter, faster, and cooler than the old i3 Mac mini I was using. That has a direct benefit to the end user, not just in sheer local grunt. On the dev front it means the machine gets out of your way and stays out of your way.