They fired the head of the BLS because they didn't like the numbers. Explicitly. You kinda have to conclude they're going to cook the books. Like they're doing everywhere else.
That interview was unfortunately timed (taped before the nomination, broadcast afterward). My takeaway was that they are not at the "just don't report the numbers" stage _yet_.
It's not universal, but B2B sales in particular has evolved toward incentive/performance compensation to a large degree. That wasn't always the case, but (most of?) the companies that didn't make that shift are no longer in business. It also extends to the sales hiring process: not that companies don't look at track records, but sales managers also have no issue firing people who don't meet their numbers.
Given he's engaged in mass murder that's severely destabilizing a significant chunk of the planet (through his unconstitutional culling of USAID among other things), it doesn't seem unreasonable to attribute his behavior to malice.
AI as a coordination multiplier would be interesting in large orgs — the AI assistant that trains on internal newsletters & minutes of all-hands says "I think you should loop John Doe from team X into this discussion because 1 year ago he ran point on something similar"
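A minimal sketch of how such a suggestion could be surfaced, assuming past internal documents are tagged with their authors: compare the current discussion against past notes with a simple bag-of-words cosine similarity and surface the author of the closest match. All names, texts, and the threshold here are made up for illustration; a real system would use embeddings and access controls.

```python
from collections import Counter
from math import sqrt

# Hypothetical archive of internal docs, each tagged with its author.
# Names and contents are invented for this example.
docs = [
    ("John Doe", "migration of the billing service to the new payments gateway"),
    ("Jane Roe", "quarterly hiring plan for the platform team"),
    ("John Doe", "postmortem for the payments gateway outage"),
]

def bow(text):
    """Bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def suggest_contact(discussion, docs, threshold=0.2):
    """Return the author of the most similar past doc, or None if nothing matches well."""
    q = bow(discussion)
    author, text = max(docs, key=lambda d: cosine(q, bow(d[1])))
    return author if cosine(q, bow(text)) >= threshold else None

print(suggest_contact("we need to change the payments gateway integration", docs))
# → John Doe
```

The "1 year ago he ran point on something similar" part is exactly this retrieval step; the hard parts in practice are keeping the archive current and deciding when the best match is good enough to interrupt someone.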
> I think you should loop John Doe from team X into this discussion
yeah, that's a useful thing that a chatbot could do...in theory.
in practice, from the recent CMU study [0] of how actual LLMs perform on real-world tasks like this:
> For example, during the execution of one task, the agent cannot find the right person to ask questions on RocketChat. As a result, it then decides to create a shortcut solution by renaming another user to the name of the intended user.
This just moves the coordination costs elsewhere, because more companies = more software = more hops to get things done.
Now, instead of Employee A and Employee B working together to solve Problem X, Company A's product and Company B's product must be used together to solve it. At least the employees know each other and sit inside the same "white box"; software products are black boxes, so the end result is almost certainly worse.
True, but the large companies are incentivised to not see or accept that. I really don't think Jassy is thinking that he wants Amazon to be smaller so it has lower coordination costs. It will also have a smaller market cap, you know?
I'm on the board or a board observer for a couple companies (some public, some startups), and it is a bit of column A and a bit of column B.
The headcount growth during COVID along with the return of offshoring with GCCs was driven by the intention to speed up delivery of products and initiatives.
There are some IR games being played, but the productivity gains are real - where you may previously have recruited new grads or non-trad candidates, now you can reduce hiring significantly and overindex on hiring and better compensating more experienced new hires.
Roles, expectations, and responsibilities are also increasingly getting merged - PMs are expected to have the capabilities of junior PMMs, SEs, UX Designers, and Program Managers; and EMs and Principal Engineers are increasingly expected to have the capabilities of junior UX Designers, Program Managers, and PMs. This was already happening before ChatGPT (e.g. the Amazon PM paradigm), but it's getting turbocharged now that just about every company has licenses for Cursor, Copilot, Glean Enterprise, and other similar tools.
We don't do a lot of the essential manufacturing for those GPUs.
And AGI automates the US competitive advantage (white collar work). Plus we're gutting our universities and national science funding so we're losing that anyways.