Hacker News | fastaguy88's comments

Really not a libertarian, but why shouldn’t Netflix have the right to choose who they distribute content to? They negotiated conditions with the creators, why shouldn’t they be able to specify the DRM? No one is forcing you to subscribe to Netflix. Or even to buy an iPad.

The issue is that the means of enforcement requires taking away other rights that they shouldn't be able to take.

What if I want to require (for anti-piracy reasons) that to use my software you must also give me complete access to your computer, all the data on it, and all your communications. You might say, "Well, if anyone is stupid enough to make that deal, let them." But it's easy to sugarcoat what you're doing, especially with less technical users. I think it's better to say, "That's just not something you are allowed to do. It's trampling on rights more important than your anti-piracy rights."

In the same way, you cannot murder someone even if they agree to be murdered (an actual case in Germany).


> What if I want to require (for anti-piracy reasons) that to use my software you must also give me complete access to your computer, all the data on it, and all your communications.

That's exactly what happens with anti-cheat kernel modules. As one might expect, ordinary people couldn't care less, as long as it works well enough.


Except that... we have a history of them not working well. For instance, the Sony rootkit https://en.m.wikipedia.org/wiki/Sony_BMG_copy_protection_roo...

We cannot expect those rootkits to be properly supported long term for any security issues they may cause. I would think that the solution is simple: nobody forces them to make their IP available on ordinary, potentially hacked computers...

If they want a hardened computer to deliver their IP, then they should sell their own hardware. But forcing their blocking into the whole stack is not acceptable.

For instance: I cannot see any Udemy or Netflix content from my computer, because their IP protection blocks the Lenovo docking station I use to connect my monitors to my MBP... each part is standard! And somehow nobody tested that scenario. So, no, that tech is barely tested; it must not be forced onto every computer.


Forgive me, but is Netflix asking for that?

As I understand it, Netflix wishes to authenticate the device and DRM their content. I'm not aware of anything beyond that (but I'm also not paying attention).

Now you may have used the example of what might happen, but then Netflix seems a strange example. Surely Apple and/or Google are more likely players in that example?


> Now you may have used the example of what might happen,

OP said "What if"; it's clearly a hypothetical scenario, not something Netflix is doing or planning to do.


For Netflix, sure, I don't care. But when it comes to banking, and you are forced to choose between two OSes or lose digital access to your bank, that is a massive problem and a restriction on citizens' freedom. Everyone needs a bank to operate, and banks need to maximize the options available for using them.

I mentioned that in another thread, but banks have a legal obligation to assess and mitigate risks in the service they give to you. You, personally, might be tech-savvy enough to understand what you are doing, but most people are not, and the bank is held accountable when something bad happens.

This is why they limit service to certain devices or OS versions, even when it comes at the expense of convenience.


Perhaps the solution then is to invent a new bank that is more resistant to regulation and gives users more freedom to secure their own funds.

> legal obligation to assess and mitigate risks

It's obviously not about risks. It's about the convenience, on their side, of supporting only two platforms and calling it a day.


Well, no one forces you to do banking from smartphones.

You can do it manually like in the old days. EXPLICITLY ALLOWING NON-GOOGLE/APPLE devices to do banking on their own mobile phones means THERE ARE MILLIONS OF USERS who can fall victim to scammers and crackers.

How can't you see all of that? IT'S JUST NOT ABOUT YOU.

Edit: please educate yourselves first; you all need to know the difference between mobile banking and internet banking.

You can downvote me all you want, but I don't want to hear a lecture from a non-security-compliant engineer about what to do about security.


Locking down a website to only be available to users on Apple and Windows doesn't make it safer. It just reduces the cost of building it because you don't have to bother testing it on any other platforms. Rather than tell users "Danger, we haven't tested your choice of OS", companies prefer to lock it down.

Users on Apple and Windows are not safer because a bank has chosen to block Linux.




Until they decide to force you to use the mobile app as a 2FA for the website. My bank did that; I literally had to buy a new phone because the old one couldn't update their stupid app. It locks you in to the latest N versions of Android/iOS.

Before you ask, no, other banks aren't any better where I live. They all stopped using physical 2FA keys years ago. And no, they won't let you come in physically for things that can be done online.


Good for them to care more about security, then.

My bank lets me do everything just fine on Firefox/linux.

For now, until they come up with some stupid 2FA solution that requires installing and updating their Android/iOS app. Banks where I live already have and there's literally no way around it (they don't use physical 2FA keys anymore).

It's not mobile banking if you use a browser.

It's just browser/internet banking.

Also, a mobile banking app has many more capabilities than just a "web page".


Because it's bad for consumers to lose choices, even if they don't normally exercise those choices. The choice is the distributed power we have against the consolidated corporate power. We can choose not to let them restrict those choices, for example with interoperability regulations.

>why shouldn’t Netflix have the right to choose who they distribute content to?

power asymmetry


There are dozens of sources of online streaming entertainment, and it's not exactly a vital good.

Sure, Netflix may not be as important as, say, housing, food, or whatever else, but I think there is something to be said about the cultural importance of [at the very least some] film and television.

There's a lot of media worth studying, analyzing, and preserving. And in that sense, between the constant churn of catalog items, exclusive content, and the egregious DRM, I think these sorts of streaming services are, unfortunately, kind of harmful.


Doesn't your second paragraph run against the grain of your first? If streaming services like Netflix are harmful then we should avoid using them. Thus it should not be important for our freedom-preserving computers to be able to access Netflix.

Now, if you want to do an in-depth study of film and television material as a whole, you're actually better off avoiding Netflix and making use of archives such as public libraries, university libraries, and the Internet Archive.


I mean, I agree that you should be able to avoid things like Netflix and make use of libraries and other archives, but that's sort of the point: there is a ton of media that never even gets a physical release anymore. Once one of these platforms goes under, or something enters licensing hell, or whatever else and gets removed, all you can do is hope someone out there with both the know-how and the access went out of their way to illegally download a copy, illegally decrypt it, and illegally upload it somewhere.

I say "know-how" and "access" because, while I'd still argue decrypting, say, Widevine L3 is not exactly super common knowledge, decrypting things like 4K Netflix content, among other things, generally requires you to have something like a Widevine L1 CDM from one of the Netflix-approved devices, which typically sits in those hardware trusted execution environments, so you need an active valuable exploit or insider leaks from someone at one of the manufacturers.

But also on top of all of that, you also need to hope other people kept the upload alive by the time you decide to access it, and then you also often need to have access to various semi-elitist private trackers to consistently be able to even find some of this stuff.

The legal issues with DRM here are hardly exclusive to Netflix and other streaming services, but at least in the case of things like Blu-rays or whatever — even if it is technically illegal in most countries to actually make use of virtually any backed-up disc due to AACS — you usually don't have the same time-pressure problem nor the significant technical expertise barrier.

>If streaming services like Netflix are harmful then we should avoid using them. Thus it should not be important for our freedom-preserving computers to be able to access Netflix.

I generally do avoid them whenever possible, though, yes. And I've explicitly disabled DRM support in Firefox on my computer. But I am just one person and I don't think my behavior reflects the average person, for better or for worse.


>decrypting things like 4K Netflix content, among other things, generally requires you to have something like a Widevine L1 CDM from one of the Netflix-approved devices, which typically sits in those hardware trusted execution environments, so you need an active valuable exploit or insider leaks from someone at one of the manufacturers.

Or just use a cheap Chinese HDMI splitter that strips HDCP 2.2 and record the 4K video with a simple HDMI capture device.

But if you are talking about preserving media or making media accessible, then it's not like we NEED 4K.


Yeah, there are a lot of torrent sites! Netflix doesn't want my business anymore; I don't really care.

There exist dozens of online services where you can store your photos; that doesn't mean companies should be allowed to do whatever they want with your photos...

TBH I don't care if Netflix wants to abuse such an asymmetry. I don't need Netflix in my life, so I'll just cancel my subscription (already have). I honestly don't want my lawmakers to spend even a second thinking about Netflix when we have so many large issues in the world right now. If we were talking about something like financial services, where I have to engage, I would be more sympathetic.

Capital doesn't really care what you want; it will exert control regardless. So in this case Netflix will continue to be part of capital that normalizes the need for DRM to access videos, write IP law, and generally force you into either accepting the world they want or becoming a hermit.

Edit: I mean to say this is true whether or not you've even heard of the company.


Well then, I will get mad when that actually happens. Until then, I don't care.

The whole notion of DRM and penalties if you circumvent it comes from the entertainment industry, and it's written into law/official treaties. This already affects everything from secure boot to HDMI standards.

Which part of what I said do you think hasn't already happened and metastasized?

> Capital doesn't really care what you want, it will exert control regardless.

Working as intended. The market doesn't care what capital wants either.

> So in this case Netflix will continue to be part of capital that normalizes the need for DRM to access videos

I can access video without DRM. If you want to access Netflix's service that's on you.

> write IP law

Netflix does not write IP law, our politicians do. Vote better.

> generally force you into either accepting the world they want or becoming a hermit.

I don't accept their world, and I'm not a hermit.


...and it will be too late.

It's sort of antitrust-adjacent. They are big enough to set market rules on the manner of distribution, like DRM and hardware-software lock-in, which doesn't directly stifle competition in their own field (only a little) but in another field, and the results are arguably anti-consumer. That sort of power should not be in the hands of a single company.

A non libertarian might ask: Is it good for society?

This is an interesting insight. The OP's constraint that no two adjacent squares are the same color ensures non-randomness. (Which reminds us why people are so bad at producing "random" sequences.)
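To make that concrete, here is a minimal sketch (Python; the color count, sequence length, and counting helper are my own hypothetical choices, not the OP's): in a uniformly random sequence over k colors, each adjacent pair matches with probability 1/k, so a coloring with zero adjacent matches is easy to flag as non-random.

    import random

    def adjacent_repeats(seq):
        # Count positions where a square shares a color with its neighbor.
        return sum(a == b for a, b in zip(seq, seq[1:]))

    k, n = 4, 100  # hypothetical: 4 colors, 100 squares
    random_seq = [random.randrange(k) for _ in range(n)]
    print(adjacent_repeats(random_seq))  # typically near (n - 1) / k, i.e. ~25
    # A sequence built under the OP's constraint would print exactly 0,
    # which by itself is strong evidence of non-randomness.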


Yeah, it’s a funny coincidence that all those constraints to make it look random produce exactly one solution. I guess the OP knows this is not ‘random’ in the mathematical sense.


PG is an excellent writer, but this essay seems remarkably misleading. The unstated premise seems to be that well-educated adults already know everything they want/need to know about everything, which is silly. I'm older than PG, and pretty well educated, but I am constantly learning new things. I don't think it's because they are not important or I am obtuse. I think it is because I am (still) intellectually curious.

Sometimes I learn new things because they are new. And sometimes I learn new things (that are well known to people in other fields) because while I know a lot about some things, I know very little about others -- so little that I don't even know those things overlap with my interests.

Those of us who enjoy learning appreciate that we will never know everything we would like to, and in fact we will never know the boundaries of knowledge for topics we care a lot about. It's not that it is unimportant to us, it's just that we hadn't learned about it yet. That's why we read essays.


In the protein annotation world, which is largely driven by inferring common ancestry between a protein of unknown function and one of known function, common error thresholds range from an FDR of 10^-3 down to 10^-6. Even a 1% error rate would be considered abysmal. This is in part because it is trivial to get 95% accuracy in prediction; the challenging problem is to get some large fraction of the non-trivial 5% correct.

"Acceptable" thresholds are problem specific. For AI to make a meaningful contribution to protein function prediction, it must do substantially better than current methods, not just better than some arbitrary threshold.

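To make the accuracy-vs-FDR distinction concrete, here is a hedged sketch with made-up confusion-matrix counts (purely illustrative, not data from any real annotation pipeline): accuracy is dominated by the easy true negatives, while FDR looks only at the positive calls you actually make.

    # Hypothetical counts, chosen only for illustration.
    tp, fp, tn, fn = 990, 10, 8950, 50

    accuracy = (tp + tn) / (tp + fp + tn + fn)  # fraction of all calls correct
    fdr = fp / (tp + fp)                        # fraction of positive calls that are wrong

    print(f"accuracy = {accuracy:.3f}")  # 0.994 -- looks excellent
    print(f"FDR      = {fdr:.3f}")       # 0.010 -- still 10x worse than a 10^-3 threshold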

Not a lawyer, but there are a lot of crimes that are not felonies. Speeding 10 mph above the limit in a 65 mph zone - not a felony. Reading hacker news for an hour during work time and not being paid $800/hr - not a felony. Calling in sick when you are hung over - not a felony. There is no federal tax on gifts for the giftee. Indeed, I suspect there are a surprising number of crimes that could get you jail time that are not felonies. Insider trading - it’s a felony, which is why people in companies with insider trading information are told they cannot trade at certain times.

I’m pretty comfortable believing I have probably not committed more than two or three felonies in my life. (Don’t want to find out I am wrong.)


How much of that is them being categorically not a felony and how much is prosecutorial discretion? 15 over probably qualifies for a reckless operation charge of some kind. Likewise, I wouldn’t be surprised if a fake sick day is wire fraud even if it never actually gets charged that way.

I would believe you’ve only committed two or three “name-brand” felonies, I’d be surprised if it were really that few under a maximally scoped prosecutor. Never borrowed antibiotics or a painkiller from a friend? Never decided it wasn’t worth the effort to file a tax document for $3 of dividends?

3 a day feels high, but I wouldn’t be surprised if it were double digits a year under an incredibly strict reading of the laws for the average person.


Those are not crimes either, though.


The claim is specifically three "felonies" a day, though.


Let's see:

C. elegans -- the worm no computer scientist can crack

S. cerevisiae -- the yeast no computer scientist can crack

E. coli -- the bacterium no computer scientist can crack

HIV -- the virus no computer scientist can crack

Has a computer scientist cracked any complex system that was not engineered?


If you read the Final Scientific Integrity Policy, included towards the bottom is the statement:

"and under “Protecting Scientific Processes,” a statement noting that early termination of extramural awards is prohibited except under certain specific circumstances."

Clearly, this is not a policy that the current administration commits to.


I am very puzzled. Apple has locked you in a cage because you bought an iPad to replace a MacBook? What is the cage, and why weren’t you the jailer?


I mean, the author explains his thoughts in the article; you should read it again.


The author's thoughts don't make sense, though. He's expecting a locked-down tablet appliance to suit the same needs and use cases as a laptop running a general-purpose OS. Or at best, he's not expecting that, but is at least complaining about it, which feels a little pointless.

He can get what he wants by buying a new Mac, as he suggests. It's not like what he wants doesn't exist. He's just complaining that some other random product doesn't do what he wants, even though it's not designed to. Pointless.


"Even though it's not designed to"

It is designed to, though. That's the thing. The line is arbitrarily drawn at not getting CLI/root access to your iPad.

His point is that over the years, Apple has blurred that line a lot. You can use keyboards and mice. You can do all your daily computing on an iPad - email, spreadsheets, YouTube, whatever.

But it's still locked down, for whatever reason, despite being a perfectly capable computer that doesn't necessarily need to be.

It's honestly really obvious what he's saying. iPads have changed over the last 5 or so years, and people on HN clearly haven't used one in a while. The author isn't _wrong_.

Apple spends all this effort to blur the lines between personal computer and a device you can compute on, and it mildly tricks users who don't necessarily realise there's a difference between the computer and the tablet, especially amongst younger generations who grew up on tablets ("iPad kids").


> It is designed to, though. That's the thing. The line is arbitrarily drawn at not getting CLI/root access to your iPad.

An Xbox's hardware is designed to run general-purpose Windows software.

However, it's been clear for a decade that Microsoft is selling the Xbox as a game-playing appliance, and has no intention of allowing you to run general-purpose Windows software on it.

If you choose to buy an appliance instead of a computer, that's your call.

You gain ease of use and freedom from having to manage device complexity, but lose the ability to do whatever you want.


Just because it's commonplace doesn't make it any less hostile to users. The tradeoff argument is legitimate, but it would be easy enough to have a yolo-mode button somewhere that voids the warranty and unshackles the user.

This is why I prefer Android. Google is evil, sure, but at least they don't treat me like a child. If I want to take one of their devices and shoot myself in the foot with it, that's fine with them (and thanks to nix-on-droid, there's plenty of ammo for such adventures).


> Just because it's commonplace doesn't make it any less hostile to users.

Sure, game consoles are user-hostile. They're also great for playing games, and they tend to "just work" with less configuration and customization than a typical gaming PC.

Less configuration tends to mean fewer problems and easier tech support, but the primary business reason game consoles are locked down is to make it harder to play unlicensed commercial games on them.


It seems you're advocating for the benefits of having a door when the objection is to locking the door.

By all means have some kind of verified/sealed mode and refuse to support anything that's not in that mode--but there are negative consequences to normalizing a lack of control over the technology that people interact with.

Take the CrowdStrike incident, for instance. Millions of people were unable to do jobs that they're relied upon to do, and we can't even hold them accountable for that, because it turns out they were never in control of their tools in the first place--locked out of the section necessary to carry out the repair.

You wouldn't tolerate a screwdriver that refused to be used to pry open a paint can. I don't see how it should be any different with a phone. I want to be able to rely on users of tools--not vendors of tools--to do things, and I can't. Not because the people are authentically incompetent, but because some vendor has made a dumb decision about what they're now not allowed to do.


CrowdStrike is software that IT departments install in an attempt to mitigate the security threats that come hand in hand with having the freedom to shoot yourself in the foot.

In this case, it was CrowdStrike that shot them in the foot.

Managing complexity has a cost that some people don't want to be bothered with.

They are allowed to choose an appliance instead of a PC, even if you would make a different choice.


> but the primary business reason game consoles are locked down is to make it harder to play unlicensed commercial games on them.

Which is user-hostile. The user bought the hardware, so they should be allowed to play whatever they please. Hiding the true cost of the hardware by inflating game prices using licensing fees is monopolistic and an attempt at misleading the consumer.

This is the exact same business model as printer companies reducing the price of printers by inflating the price of printer cartridges and locking down the ability to use third-party ones. It is unbelievable to see people on a site called "Hacker News" defending that business model.


If there were no way to know in advance if you were buying a gaming appliance or a gaming computer, you might have a point.

Some people prefer the simplicity and reliability of an appliance.

Some people embrace complexity in the name of having the freedom to do anything (including the freedom to shoot yourself in the foot).

The notion that consumers shouldn't be allowed to make decisions that are different than your own... THAT is user hostile.


> The notion that consumers shouldn't be allowed to make decisions that are different than your own... THAT is user hostile.

Having the choice is fine, but if there's no way to opt out then it's not a choice. While far from perfect, the Xbox One is a good example of a video game platform that offers an opt-out. And it works: it is one of the most secure gaming consoles on the market, and yet it still offers consumers the ability to create their own game software for it.


There is, in fact, a very simple way to opt out of buying an appliance.

Don't buy it.


At some point that means not buying any modern technology because it's all been locked down. We're already most of the way there.


There are plenty of other choices.

Linux isn't going anywhere.

You just don't think people should be allowed to make choices you don't approve of.


> You just don't think people should be allowed to make choices you don't approve of.

This is what Apple believes, not the person you're replying to?


> Hiding the true cost of the hardware by inflating game prices using licensing fees is monopolistic and an attempt at misleading the consumer

R&D is undertaken with the expectation of future reward, often including licensing fees. However, if we consider component and manufacturing costs, those have always been included in the price of the Nintendo Switch. Even the disc-based PS5 became profitable by that metric after 8 months, during a global pandemic with supply chain shocks.


Most of the time people aren't stupid; if they care about where they are spending their money, they aren't buying devices in a split second and complaining later.


People aren't stupid, but there is no way to escape this business model unless you go to the PC. And the PC is only begrudgingly still an open platform. If something is ever going to successfully replace the PC, it will be a walled garden as well.

I am appalled by how easily people dismiss the importance of open platforms as insecure and inconvenient. How will people ever learn technical skills if all the technology they own is locked down and glued shut?


They buy Android, Windows, Jolla, Pinephone, ...


How long before Android and Windows are also walled gardens that are completely locked down? They're certainly moving in that direction.

If you buy a device you should be allowed to fully own it. You shouldn't be forced to buy niche inferior alternatives or pay a huge premium.


One more reason to support Linux and BSD OEMs.

The Tuxedo, System 76, iXsystems, Pinephone, Jolla, SteamDeck (with native apps, not Proton),... of the world.


He says in the article: "But even if I could somehow get macOS running on my iPad Pro, would that resolve this tension? I don't think so. A tablet lacks a keyboard and trackpad, and even if I buy models designed for the iPad, tablets are all about push, poke, and drag."

So in the end, even if he could get CLI/root access to his iPad, it wouldn't matter anyway because of the perceived quality of the iPad's accessory peripherals compared to those built into the MacBook.

So he should just buy a damn MacBook. He wants an iPad that runs macOS and has a quality keyboard and trackpad, so he should buy the product that has all of those things built in and not complain that he can't jerry-rig the iPad to do the same.


This topic comes up a lot when Apple releases a device with the “pro” label.

So sure it contains an M2, but it’s also a fanless device that combines its entire componentry, including the screen, into a package that is just a few mm thick. On top of that it also has a much smaller battery.

That kind of heat envelope makes it suitable for burst work, but poor for enduring workloads. Unlike the Mac it’s not going to allow limitless multitasking while exporting video and other processor heavy tasks. This hardware limitation is recognised in the types of software that the device runs competently.

So I partly blame Apple’s marketing, but I also think caveat emptor - why would the buyer assume that all of these other Mac shapes and sizes exist if they could actually all be squeezed into a 5mm-thick enclosure?


> But it's still locked down, for whatever reason, despite being a perfectly capable computer that doesn't necessarily need to be.

Consider that some people (and I guarantee you that they vastly outnumber the "my iPad should run macOS/Linux and be a full laptop-equivalent" crowd, probably by several orders of magnitude) may want a locked down perfectly capable computer if it means they don't have to waste their time and brain energy on dealing with things outside of their goal.


Your concept of designed is very different from mine. The iPad is capable of providing a shell interface, but it is clearly not designed to. It is designed to provide a secure media consumption experience. There is nothing arbitrary (from a mass consumer security perspective) about not providing a shell. Providing a shell makes it much easier for bad actors to dupe unsophisticated users.


> and people on HN clearly haven't used one in a while

That applies to a lot of the tech that's talked about here.


My understanding is that the post is essentially a complaint about deceptive marketing.

iPads are marketed as capable of "doing all the things you love", how it's "so versatile it's up to any task" and so on - leading to perception that it's like a computer, except in a different form factor. While, in reality, of course it's not.

Personally, I never found any subjectively meaningful use case for iPads except for portable media consumption (aka watching movies on the Porcelain Throne, and even then it's not a great option as it lacks multi-user support). Every time Apple announces a new one, I have this feeling of cognitive dissonance between what the device actually is and how it's marketed.

(I'm sure there are lots of good use cases for iPad - just nothing I personally need or care about. Aka "I'm not in the target market")


Apple is marketing to everyone and not just the HN crowd. For many people their phone is their only computing device, and for them an iPad is likely a big upgrade. When I have to proof photos or do a bunch of office-type work, an iPad has become my go-to device. When I'm programming, not so much. But I don't think that's a big secret to this crowd.


I agree overall, but I don't think it's a secret to anyone, except, possibly, for people new to iPads who haven't researched what it really is.

YMMV, but I don't find this situation funny or deserving a sarcastic remark. What happened is that a person had seen an ad for a device with very good hardware specs that they cannot use because it doesn't work for them software- and policy-wise, and they're unhappy about it. I can understand if that person would make fun of their unhappiness (a perfectly valid way to handle the discomfort), but I wouldn't make fun of them as a bystander.

I think it's perfectly natural and expected to voice discontent if you saw an ad but the product wasn't a good fit for you and the ad failed to disclose it (for obvious reasons, but still creating a conflict of interest). Especially because Apple is marketing to everyone, including software developers.


Are you telling me people use computers for different reasons? No I won’t believe it. My specific workflow is all that is important and if that doesn’t fit to every form factor of device out there, then the manufacturer of that device is a fraud who is deliberately trying to spite me.


> then the manufacturer of that device is a fraud who is deliberately trying to spite me

That's a weird conclusion, even for a sarcastic remark.

Is it fraud - is the marketing deceptive? Possibly, yes; I personally believe it can be said so. I think it could be the case because it fails to mention the nuance that iPads aren't a good device for certain tasks. If an ad says "for everyone" without any asterisks and small print, it's valid to complain that this is not true. It may be the norm, but that doesn't change the fact that there's a gap that may affect an uninformed person's decision making towards purchasing a device that is not a good fit for them. I mean, anecdotally, that's what happened to the author after all.

Is it deliberate fraud, though? I haven't been in the room when that ad was discussed, so I cannot possibly tell. I'm not versed in marketing, but I believe I've heard that it's quite a common practice to exclude any negativity (aka "when or why wouldn't you want our product?") from marketing materials, for the money doesn't smell. I have respect for the people and companies that do disclose such things, for I perceive it as a signal that they respect me (aka not wasting my time researching). But whether it was actually discussed and dismissed ("Should we mention that developers should rather get a Mac? Nah, we want them to buy an iPad too, even if it's useless to them!") or whether the idea hasn't even been mentioned (e.g., if it's simply not a thing in Apple culture) is unknown to me.

Are they trying to spite anyone? I don't have any evidence that suggests so, so I find it highly unlikely. While Apple has different system of beliefs and values, drastically different from some freedom-loving software crowds, I don't think I've seen signs of any significant deliberate hatred towards those who don't share their values, or willingness to make their lives worse somehow. There could've been some less than great attitudes (but my memory fails me here, I only have a vague idea that I might've possibly heard or read something that didn't resonate well with me), but I don't recall anything seriously hateful.

All this said, I would love for us all to have more discussion about ethics in marketing. Honest, open, and ideally without any sarcastic remarks (for they rarely help and frequently discourage civilized discussion).


The problem is one of self-awareness, or rather the complete lack of it. The values I see extolled in matter-of-fact tones are a consistent feature of not just these forums, but are particularly noticeable here.


"He's expecting a locked-down tablet appliance to suit the same needs and use cases as a laptop running a general-purpose OS."

Which is a perfectly valid complaint since it doesn't need to be locked down. It's an entirely artificial limitation. An iPad pro is like selling a Ferrari that can only turn left at 30mph.

I don't understand the "buy something else" mentality. Freedom of computing on hardware you own used to be core to the tech community, but I guess it's now cooler to side with monopolists.


An iPad doesn't have the thermals of a laptop. They are not the same device.

People who want root on everything and a command line are a tiny minority of the population.

Most users don't even know what root is. They don't want to know. It doesn't interest them. They don't find it a limitation because command line computing is something they're actively indifferent to.

Having a powerful engine (for burst computing) changes nothing. They still don't care. They want something that runs CapCut or whatever, and that's the extent of their interest in technology.


I read it and don't understand it at all.

He's bought something whose limitations and target demographic are well known.

And then complaining when he hits one of those limitations.


You've never bought anything you knew could be improved?


To paraphrase MKBHD, only buy devices for the features they have right now. Otherwise you will often be disappointed.


That's independent of wanting obvious limitations removed from a device that is already otherwise useful for you.


He's complaining because Apple has, over the years, started to blur the line between a personal computer and a device that you can personally compute on.


The infuriating thing is that a huge number of those limitations exist as a few bytes that Apple does not let you flip, somewhere in the OS's code or configuration.

It would be unreasonable to buy, say, a flip phone in 2007 and complain that it can't replace a graphics workstation. It didn't have the hardware for that workload. Today, it's the artificial nature of the limitations that make the device akin to a cage.


Yes, because flipping those bits would make the iPad a Mac.

And the whole point of an iPad is that you have a simple and consistent experience.


I don't recall which WWDC it was... They had a slide with the largest text they ever used.

The question was: "Will iPadOS and macOS merge?"

"NO"


Found it.

It was iOS and macOS, but still...

https://youtu.be/DOYikXbC6Fs


For a phone, you really do want a different interface (unless you're tethering it). That's much more justifiable.


Because that would either mean locking down MacOS or opening up iOS. Apple would love to lock down MacOS, but it'd be incredibly bad press to admit to that publicly.


>a simple and consistent experience

Has that not been Apple's entire marketing thrust since 1984?


It's fine that Apple wants to ship with those bits unflipped. What's unreasonable is that Apple stops you from flipping them.


I think it is worth pointing out that Bell Labs was an Engineering research lab. Scientists there created new disciplines, but in the service of relatively well defined engineering goals.

Contrast this with the NIH, where the science also has a goal - improving human health - but the system to be improved was not engineered. Curing a disease, which has a natural origin, is quite different from improving communications channel capacity.

I suspect that managing engineering research is much more amenable to process analysis than research on biological systems.


It might be insane if you believe that "staff" are all doing administrative duties. But, as was pointed out, "staff" are often anyone who is not tenure-track faculty. So librarians, research technicians, environmental health and safety, IT support, etc.

A more useful comparison would divide staff into "supported by tuition" (should be related to student count) and "supported by external grants and clinical income".

This idea that costs have increased because of administrative staff expansion is a popular one, but one that ignores what R1 universities spend money on, and where that money comes from. (Ironically, I suspect that the university may be spending more money on research, because of limits on indirect costs.)


I think, however, the total count is extremely important.

Every University’s purported mission is to educate students and advance our collective knowledge together with its students.

That’s it.

If the university makes more money from treating patients than teaching its students, then its mission can’t help but shift.

Likewise if the bulk of the staff are not focused on teaching and educating, then its mission can’t help but shift.

This is a problem.


> Every University’s purported mission is to educate students and advance our collective knowledge together with its students.

> That’s it.

Not if the university has a medical school. Virtually all R1 universities with medical schools have a hospital, and a large clinical practice. Most of medical school is an apprenticeship where you treat patients. Medical schools need patients, which means a lot of additional staff.

Likewise, in most fields it is no longer possible to advance knowledge just by going to the library or writing on a white board. Knowledge is advanced through experimentation, and experimental equipment and reagents cost money, and need staff to use and maintain them.

No university (and certainly no medical school) makes enough money in tuition and fees to pay for the education provided, and I seriously doubt that many universities have supported themselves solely through tuition since the beginnings of universities in the Middle Ages.

You are certainly correct that university deans and presidents have seen their mission shift with the increasing cost of education, and indeed faculty are writing many more grants than they did 75 years ago. So time commitments have shifted. But there is an implication that it could have been some other way -- that the money is there (or could have been there) if some other path were chosen. It is hard for me to imagine where the money might have come from.


Is it? I disagree. The university I went to has a mission to “conduct research, provide education, and engage with the community to improve the lives of people and the environment”. MIT’s is to “educate students and advance knowledge in science, technology, and other areas of scholarship”.

It’s not a problem. You just have a narrow view of what you think our higher ed institutions should be.


You’re certainly welcome to disagree. But I strongly disagree that it’s not a problem.

If higher learning isn’t the core mission, then there are better ways to advance knowledge and improve the lives of people and the environment.

Per https://jsri.msu.edu/publications/nexo/vol-xxii/no-1-fall-20...

> This has had several consequences for the governance of universities: 1) the role of shared governance has receded in importance in the day-to-day governance of universities; 2) the balance of power and authority has shifted toward administrators; and 3) faculty have been subjected to a series of performance measures that disproportionately values productivity over shared governance participation.

Publish-or-perish and shoddy research is a direct result of this shift in the mission, as measurements became all but expected.

By the time I entered uni in the 1990s, things were shifting negatively in higher institutions.


I've worked for both big R1 Universities as well as top-tier independent research institutes. I ran computing facilities that supported bioinformatics facilities, and spent my day interacting with both research leaders (PIs/faculty) and administration.

I don't believe point #1 - I have been involved in shared governance bodies as a student and staff, and at least where I've spent time, these bodies are strong.

For point #2, I never saw any shift of authority to administrators. In fact, I left academia because I was given a mission to centralize computing resources to ensure we were responsible stewards of the data we held. Instead, PIs would end-run around shared computing facilities, spending their own grant money on high-end workstations and USB drives. I left and went into big tech because I was tired of fighting with essentially 50-100 small fiefdoms. The administrators were powerless, and if they tried to force the PIs to submit, the PIs would simply go someplace else.

For #3, while "impact factor" took on a larger role, I did not see a problematic shift in how we did science. Everyone was given adequate resources to participate in governance. If anything, the outsized influence individual PIs had over how they did their research made it more difficult to ensure data was stored safely, analyses were reproducible, and so on. That, to me, is a greater risk than the fear that administration was telling researchers what to research.

There are problems with higher ed in the US, but I don't understand how to equate a perceived shift away from "shared governance" with deep fundamental issues in the mission of our higher ed system. We need both a focus on educating young people (need to have fresh minds and bodies to keep the research machine churning) as well as basic AND cutting edge research to keep progress moving forward.


For #2, are you saying the administration doesn’t hold the purse strings?

Why the resistance to the top-down approach?

Nobody resists if it means more resources and faster procurement of those resources.

Instead it seems researchers are forced to navigate politics and raise funding.

Or do you mean that each department has its own IT department and it’s resisting consolidation?

For #3: It’s not about resources but about how “impact factor” is measured, and whether it’s a useful metric.

Often, for example, little attention is given to confirmation of a suspected dead end. That still requires in-depth knowledge of the subject, is still research, and advances knowledge.


For #2, yes - the administration doesn't hold the purse strings. Each PI gets their own grants, and thus controls how the money is spent, apart from overhead. I had to make a value proposition for the PIs to explain why they couldn't afford NOT to modernize their data storage. Unfortunately, it's cheaper to go to Staples and buy a USB drive than it is to pay for properly archived storage.

The resistance to the top-down approach was, to me, a misunderstanding of the risks of storing their data outside of a safe place, and a fear of losing control of their data.

The last institute I worked at was focused on basic biomedical research - dead ends were what we chased all day!

