We don't like it because it can cause widespread damage to the ecosystem, not just to us. Life didn't evolve in a vacuum, and the balance of natural ecosystems can be fragile.
> [Feral pigs] are invasive and cause millions of dollars in agricultural damage each year, rooting and trampling through a wide variety of crops. They prey on everything from rodents, to deer, to endangered loggerhead sea turtles, threatening to reduce the diversity of native species. They disrupt habitats. They damage archaeological sites. They are capable of transmitting diseases to domestic animals and humans.
> Interestingly, in their native range in Europe and Asia, pigs do not cause as much ecological damage. In fact, some studies indicate that they may modify habitat in important ways for species that have evolved with them, such as frogs and salamanders.
With Tango Gameworks' Hi-Fi Rush, Microsoft decided to make it free with Game Pass at launch, which obviously hurt its sales.
If you sell chips but charge people $1 for two weeks for unlimited chips, then just $10/mo for unlimited chips, you might be disappointed with direct chip sales.
I have one. The keyboard is quite comfortable, even with my large hands. The screen is both high resolution and almost square, making it quite comfortable to code on. The touchpad is a comfortable size and responsive.
I think the size and proportions are ideal for a small, thin laptop. Conversely, all of the large tablets I've tried have had uncomfortable detachable keyboards and, at best, OK touchpads.
For the PlayStation 3, another thing driving Linux's removal was that organizations were buying PS3s to use in data centers. But Sony sold them at a loss (IIRC over $100 per unit, even at a $600 launch price back in 2006) and intended to make it up on revenue from games sold. They also didn't have nearly enough units at launch because they were using a brand new processor (designed with help from IBM) and the brand new Blu-ray disc format. That slowed down initial production, and the PS3 had to launch in time to compete with the Xbox 360, which had launched pre-Christmas 2005 and also offered an HD DVD attachment.
Notably, the PS3 at launch was actually cheaper than some dedicated Blu-ray players, so some home theater enthusiasts bought them (buying few or no games), eating into Sony's revenue even more. It took several years for the PS3 to become profitable, but it won the format war.
They should have sold a datacenter version, without the optical disc, with a copy of AIX and/or BSD, with an OS license of $200 (or at least under $1,000).
It doesn't look like a workstation was ever produced, just blades and embedded.
A hypothetical Datacenter version might have skipped the optical drive, perhaps cutting down on their losses. At launch I doubt they would have had enough spare processors to build them without cutting into their PS3 production capacity.
Additionally, many game studios reported difficulty writing for the PS3's processor compared to the Xbox, which had gone with a processor pretty similar to most PCs at the time. So by the time Sony had ramped up capacity to meet PS3 demand, there were several years of worse/less performant PS3 versions of games developed for multiple platforms, making Sony's console look worse. That must have soured Sony's opinion of the Cell processor overall, and it would have been a tough sell internally for the PS4 to use a custom processor architecture again, so they opted not to.
Cell used the PowerPC ISA, as did the Xbox 360's CPU. Both were designed in the same IBM facility, separated by a floor. IIRC the Xbox team indirectly learned from the Cell team's mistakes at the process/microarchitecture level.
Cell was definitely weirder to code against, and Sony prioritized maximum theoretical performance, whereas the Xbox went for a more general-purpose chip architecture.
So strictly speaking the Xbox 360's CPU wasn't like most PCs of the time in the x86 sense; the real contrast was three mostly identical cores in the Xbox versus the custom Cell, with its special ways of squeezing out performance.
There's a book about the development of both of them: The Race for a New Game Machine: Creating the Chips Inside the Xbox 360 and the PlayStation 3 by David Shippy & Mickie Phipps. [0]
It has some details on the awkward position the IBM developers were put in.
See also Rodrigo Copetti's excellent articles on the PS3 [1] and Xbox 360 [2], and Ars' Hannibal's article on the Xenon chip. [3]
Sort of. The Xenon cores are pretty damn close to the Cell PPE cores, just with VMX-128 strapped to them. They even share some taped-out blocks, and have almost all of the same microarchitectural issues, like the load-hit-store penalty.
Sony should have realized that the Cell/Power was needed in great quantity, and insisted upon a second supplier. Motorola started making the PowerPC 601 in 1992, so a secondary foundry was absolutely available.
From the wiki: "The introductory design is fabricated using a 90 nm SOI process, with initial volume production slated for IBM's facility in East Fishkill, New York."
AMD did this for Intel up to the 286. Sony should have insisted upon a tidal wave of wafers, if needed.
IBM sold that as the BladeCenter QS series. Reportedly there was even a prototype version with essentially the same hardware as the PS3, including the optical drive (production units were dual-CPU, without the GPU or optical drive).
FWIW, early designs for the PS3 also omitted the GPU in favor of a second Cell chip. The end of Dennard scaling meant they didn't hit the clock speed they originally intended (close to 5 GHz). Rasterization was originally supposed to be done in software on that second Cell, but relatively late in development they had to strap a dedicated GPU onto it. That's why Cell reads from VRAM are incredibly slow, around 16 MB/s.
While true, it's important to note that the second Cell wasn't going to be the same type as the main CPU's Cell. The Cell they intended to use as the GPU would have had only 4 SPEs (vs. 7 in the main Cell) along with various rasterization components. It would have been cool. Settling on the Cell was compromise after compromise.
IIRC the PS2 also got a huge boost from being a cheap alternative to dedicated DVD players, and that obviously paid off eventually. I wonder how that factored into their PS3 strategy.
Indeed. Although Sony hadn't created the DVD format, having it did boost sales of the PS2. With the PS3, Sony was pushing its own format in direct competition with Toshiba's HD DVD, and I'm sure losing the Betamax market to VHS was still on their minds. So they decided to sell PS3 consoles at a much greater loss. It did pay off as well, but the first few years cost Sony a lot.
Ethernet frames include destination and source addresses so that multiple sources and destinations can share the same wire. As the sibling comment mentioned, hubs (not switches) simply broadcast every frame to every connection, and the NICs at each end had to determine whether they were the intended destination. Some hubs didn't even require power because they were essentially just joining the wires together. Switches require power because they learn which addresses respond on which port and can forward frames more efficiently based on that knowledge.
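For illustration, a minimal sketch of that learning behavior in Python (made-up names and data, not any real switch's firmware):

    # Toy model of a learning switch (illustrative only).
    # mac_table remembers which port each source MAC address was last seen on.
    mac_table = {}

    def ports_to_forward(in_port, src_mac, dst_mac, all_ports):
        mac_table[src_mac] = in_port           # learn the sender's port
        if dst_mac in mac_table:
            return [mac_table[dst_mac]]        # known destination: forward out one port
        # Unknown or broadcast destination: flood every other port,
        # which is effectively what a hub does with every frame.
        return [p for p in all_ports if p != in_port]

    ports_to_forward(1, "aa:bb:cc:00:00:01", "ff:ff:ff:ff:ff:ff", [1, 2, 3, 4])  # -> [2, 3, 4]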
I've had the USB-C ports on three phones (two Nexus 5Xs, one Samsung S21) fail on me unexpectedly, and without wireless charging I suddenly couldn't charge the phone. I'm unlikely to buy a phone without wireless charging for that reason.
In a similar vein, my Nexus 7's microUSB port died and I continued using it for years with wireless charging. Not really a common feature on tablets anymore
Yikes. I've had multiple laptop and phone power connectors flake out gradually, but so far never suddenly. I just got my first USB-C phone because my old micro USB one was crapping out, among other things. I'm gradually migrating from the old phone to the new one. That would be much more of a hassle if the old phone were totally unable to charge. Now I'm worrying about USB-C, ugh.
I've only experienced gradual failures so far, which give me time to fix or migrate. Sudden failure is harder to deal with. Wireless charging plus replaceable batteries would be a perfect combo.
For JavaScript there's been a sharp increase in the amount of tooling that is written (or rewritten) in Rust: Turbopack, SWC, Vite's upcoming rewrite of Rollup. It doesn't seem like Rust is competing with JavaScript for writing web applications, but the tooling is definitely seeing more Rust adoption.
In general, Rust is great for writing developer tooling. You need low latency, high throughput, and a low memory footprint, while being stable enough that it doesn't crash much. Projects like Turbopack, Ruff (a Python linter), and Biome have all had some success and may continue to.
I was at the React Rally conference where Falcon was publicly announced in August of 2015. I recall that Facebook gave a GraphQL presentation right before.
It seems GraphQL was first announced publicly in February 2015.
>The CEO directly created the perverse incentive: actually hitting your goals only meant more work.
You say this as if employees are absolved of any responsibility for the performance of the company. You're not clever for saying, "Oh, 100% means we weren't ambitious? Then we'll work 80% as hard." In fact, I think you'd have to be pretty dense to not understand the intention of the KPI: set lofty goals and try to get there.
If you don't think it's working, why not go back with ideas to improve it, rather than doing less work? I mean, this company literally failed (from the sounds of it) and people here are blaming the CEO's KPIs versus the employees who intentionally work less to game the system. Strange.
As the saying goes: if everyone around you is an asshole, you're the asshole.
If you create a system which is abused by 100% of employees, that's on you. Everyone who works at a company does so for their own personal benefit. You can moralize about it if you want, but I doubt you would work for a company if it didn't provide an incentive structure of some kind.
> why not go back with ideas to improve it
I was a junior developer working for a 1500 person company. What should I do, demand an audience with the CEO, who lived in a different country?
I don't care if the company fails and there's no amount of emotional manipulation that can ever make me care. Only double-digit percentage ownership could make me care. Of course you can't get double-digit percentage ownership of a big company like Facebook, but in a company that big there's almost nothing I'm doing that moves the needle anyway, so there's no reason to be concerned; any shares are gambles.
I do care about my professional reputation. I'm highly skilled and understand how to improve social dynamics on a team, making me very valuable to have around. Clever leadership will figure out a way to align my skills with their own goals. But, if they don't, that's their problem.
> I don't care if the company fails and there's no amount of emotional manipulation that can ever make me care. Only double-digit percentage ownership could make me care.
That's the bottom line, isn't it? If the CEO wants to align employees' behavior and motivation with his own, he needs to structure the employees' compensation to be like his own compensation. If my compensation is just a "competitive" salary and a few token stonks, then I'm going to do a 9-5 job and hit the required 80% of my KPIs before I go home for the day. If my compensation consists of life-changing equity? I'll work days, nights, weekends, and holidays.
It's important not to confuse "I don't care if your company lives or dies" with "I'm going to be a bad worker." I'm an amazing worker.
As I said, I care about my professional reputation. When/if the company dies I want every single one of my former coworkers fighting to get me hired at wherever they land. You don't get former coworkers clamoring to get you in at their new shop by being a bad worker.
Companies want me to work for them, and former coworkers want me to work with them. I am just not emotionally invested in the companies who pay me a wage, I'm going to set reasonable boundaries on my time, and I'm generally going to respond rationally to incentives. Responding rationally to incentives includes considering whether the extra work required to achieve a particular bonus is worth the effort; often it is not.
You don't need to give each individual employee double-digit percentage (>10%) equity for it to be life changing. Let's say "life changing" for a normal worker-bee tech worker is ~$500K and the company is an average, run-of-the-mill $5B market cap company. We're talking about a 0.01% share for each employee. If you have 1000 employees, that totals 10%, representing life-changing equity for all 1000 of them. Hoping my math is not off by a factor of 10 or something.
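A quick sanity check of that arithmetic (the $500K and $5B figures are just the assumptions above):

    # Check the equity math above (illustrative assumptions, not real figures).
    life_changing = 500_000                    # target payout per employee, USD
    market_cap = 5_000_000_000                 # company market cap, USD
    per_employee = life_changing / market_cap
    print(f"{per_employee:.2%} per employee")               # 0.01% per employee
    print(f"{per_employee * 1000:.0%} for 1000 employees")  # 10% in total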
As it turns out, crafting good incentives is the responsibility of management, it's a really hard problem, and CEOs tend to be reluctant to use the same method that boards use to incentivize them: a gigantic package of long-dated stock options. Maybe they think it doesn't work very well.
> think you'd have to be pretty dense to not understand the intention of the KPI: set lofty goals and try to get there.
So, hypothetically, if I busted my ass for three months straight to hit 100%, just to be told "Oh, your goals were set too low, so here's more work", what incentive do I have at that point to work even harder? The new target's 80% was my last target's 100%, which I struggled to hit (because, as you said, these goals are lofty). So if I bust my ass even more and do even more overtime, what do I get then? "Your goals were set too low, so here's more work." Do I just continue this until I burn out? Or do I hit 80% of my target so my workload doesn't spiral out of control, and get exactly the same bonus?
> I mean, this company literally failed (from the sounds of it) and people here are blaming the CEO's KPIs versus the employees who intentionally work less to game the system. Strange.
Employees gaming their KPIs is a reaction to and a direct result of the CEO's "if you hit 100%, you didn't work hard enough" policy. CEOs using constantly moving targets and increasingly unrealistic expectations to extract every last drop of productivity out of their employees is met with an equally maximizing reaction: employees find a maximum amount of time and energy they can spend such that the status quo is preserved and the bar doesn't move as high next time.
I would bet KPIs were not the main reason that company went under. Obviously we don't have specifics, and companies rarely fail for one reason. I didn't blame the CEO for the company going under either - I simply pointed out the perverse incentive.
> If you don't think it's working, why not go back with ideas to improve it, rather than doing less work?
Doesn't sound like "less work", it sounds like they did the amount of work management wanted them to do. Just like working 40 hours a week is "less work" than 60 hours a week, but I'm not going to work 60 hour weeks just because of some sentimentality around the company being successful.
If a boss gives me a list of five things to do and tells me that doing 4 of them is perfect, while doing all 5 means I didn't set my goals properly, I'm going to do 4. Particularly if my compensation depends on only doing 4.
It boils down to that ridiculous "stretch goals shouldn't be achievable because you should be shooting for the unattainable". If you want to have that mindset, don't beat the employees for achieving everything. Beat yourself for not curating their goals.
Tracking and measuring KPIs is good; without it the company won't know its current performance or how to fix and improve it (though beware of burnout). However, when a KPI becomes a target with a financial incentive, it ceases to be a good measure, because everyone will try to optimize things to reach the target.
Velocity (work completed / work planned * 100%) should always be a measurement, because it can help identify errors in the workflow or with a team member, and prevent demotivation when people need to perform tasks outside their goals.
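As a rough sketch of that measure (the numbers here are made up):

    # Velocity as a measurement, not a bonus target (illustrative numbers).
    planned = 10      # work items planned for the iteration
    completed = 8     # work items actually finished
    velocity = completed / planned * 100
    print(f"Velocity: {velocity:.0f}%")  # 80% - a signal to inspect the workflow, not to punish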
> If you don't think it's working, why not go back with ideas to improve it, rather than doing less work
the description given didn't make me think it's the kind of workplace where the CEO is going around asking people, "what can I fundamentally change about our goal-setting process?"
in any case, goals should be achievable, so if you punish someone for achieving them, you messed up
Yes and no. There's another view where employees set actually ambitious goals and only get 90% of the way there and are still rewarded and the company also wins.
If you are hitting your goals every quarter, that is a great thing - but I tend to agree, at the other end of the spectrum you have people setting easily achieved goals to check a box.
The real issue is weeding out the people who are just there to check a box when you're small, and, when you're large, designing a system that internalizes the fact that a lot of your employees are there to check a box.
the vast majority of workers do not internalize empathy for their company. They are there for the paycheck. You can either thrash against the entirety of the human race and demand that people lie to your face about how much they care about the company's goals, or you can create a system which takes advantage of the self-serving nature of your employees so that their goals align with your own
For years I was the one who went above and beyond. I thought of brand new ideas, approaches, and wrote so much code some weeks would just fly by.
You know what happened?
1. I got even more work. So much more that I no longer had the freedom to explore and think.
2. The carrot on the stick would be re-attached to a longer stick. It was never enough. I was performing "exceptionally" but my yearly review would never move me closer to my goal.
3. I would end up getting tied down in a bunch of stuff from (1) that ended up leading to burnout because I was no longer working in a place where I felt useful and needed.
4. I still got laid off.
Now, I just go to work and come home. I am pissed I even have to do pagerduty because 9/10 times the problem is someone wasn't given enough time, creating a bug, which was then subsequently never addressed because aGiLe methodology says you only move forward.
"High power" CEOs and PMP-certified project morons are the reason why people's care for your product ends with their paycheck. No amount of "demo days", "email updates", or metrics will fix it. I, and everyone who is like me, will game your metrics until they stop being useful. It's not malicious. It's an optimization. If you want me to do exactly what your metrics ask I will. Nothing more, nothing less. You pay me for 80 hours a check, you get 80 hours. Nothing more, nothing less. If you don't want me to think I won't. Afterall, I'd rather save my brain cycles for things I enjoy. You're paying me to squander my talent. That's YOUR problem. Not mine.
So, you get another job. With great odds of increasing your salary more than you could at your current position. There's really few downsides for workers in having this attitude.
> Well the paycheck ends when the company folds. "Doing good work" is in the best interest of all parties.
Option A is being an average worker, not trying especially hard, not thinking about work outside of work hours, and having absolutely no emotional investment whatsoever in the wellbeing of the company - and potentially having to get a new job in a few years time, often with a pay increase
Option B is "doing good work", "going the extra mile", "being a rock star", putting in lots of extra time and effort and emotional energy (for years!) - just on the tiny chance that your particular efforts will be the difference between the company folding vs. being successful
Option A sounds a hell of a lot better to me than option B.
There are more companies. I also tend to abandon ship before the company folds. I left before the 100mil loss and subsequent layoff.
Additionally, at such a large company (1500 people), my individual efforts have little to no effect on whether or not a multinational corporation goes under.
This is the "I swear communism works, you just need people who actually care about worker life quality and aren't tyrants to lead the revolution" school of org theory.
I mean, that's also probably true. Best form of government is a benevolent dictator and all that...
More that you have to have realistic expectations when you structure an org, that people doing the work are going to pad and sandbag in successive tiers up, and people in charge of outcomes are going to push for bigger and more down.
Interestingly, individual firms behave more as centrally-planned economies, and market activity is mostly absent inside of them. This is a long-studied curiosity: https://en.wikipedia.org/wiki/Theory_of_the_firm
The book _The People's Republic of Walmart_ by Leigh Phillips and Michal Rozworski makes a compelling argument that employing, at the state level, the same technologies companies use to manage supply chains and decision-making could actually produce a working communist system.
Of course, we first would need to agree that this is desirable and that's probably the point where it breaks down.
> [Feral pigs] are invasive and cause millions of dollars in agricultural damage each year, rooting and trampling through a wide variety of crops. They prey on everything from rodents, to deer, to endangered loggerhead sea turtles, threatening to reduce the diversity of native species. They disrupt habitats. They damage archaeological sites. They are capable of transmitting diseases to domestic animals and humans.
https://www.smithsonianmag.com/smart-news/feral-pigs-are-inv...
> Interestingly, in their native range in Europe and Asia, pigs do not cause as much ecological damage. In fact, some studies indicate that they may modify habitat in important ways for species that have evolved with them, such as frogs and salamanders.
https://theconversation.com/feral-pigs-harm-wildlife-and-bio...