Then what happens is somebody making $100+/hour would spend several hours to do the job that would be covered by a tool that costs under $50/month.
It's never that simple, though, is it? Suppose that tool solves a problem that a $100/hour employee would have solved manually in a day. How long did it take the employee to identify and choose the tool, arrange the purchase, and then learn to solve the problem using it? Probably a few hours too. So that $50/month tool had better be useful for replacing that job several times in a year or it's unlikely to be a net win in terms of time and money. It's certainly possible that good tools are far better than that cost/benefit ratio and using them is easily justified, but I'm guessing that in these organisations with hundreds of different SaaS subscriptions only a few of them are in that class.
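FWIW, the break-even arithmetic here is easy to sketch; the $100/hour and $50/month come from the example above, while the onboarding hours are my assumption:

    # How many day-long manual jobs must the tool replace per year to pay for itself?
    hourly_rate = 100
    subscription_per_year = 50 * 12                     # $600/year
    onboarding_hours = 4                                # find, buy, and learn the tool (assumed)
    onboarding_cost = onboarding_hours * hourly_rate    # one-off; amortise as you see fit
    manual_job_cost = 8 * hourly_rate                   # one day of manual work
    break_even_uses = (subscription_per_year + onboarding_cost) / manual_job_cost
    print(break_even_uses)  # plug in your own numbers; the ratio moves quickly with job size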
Of course the elephant in the room is that there used to be another alternative, which was buying a tool outright for say $1,000 and then using it indefinitely, which would be a financial win compared to an equivalent $50/month SaaS tool in well under two years. Yes, there are factors like CAPEX vs. OPEX to consider so this isn't so simple either, but ultimately paying much more money for the same thing is still paying much more money, however you slice it.
The other gain you get from having the employee build the tool is that the employee is now better at building these tools, so you fundamentally have a better employee than one who had to spend the time learning a non-transferable, one-off tool.
On the other side of that equation, what you would hope is that your $50/month is buying you a subscription to steadily increased functionality over time. That's something your $1000 up-front payment wouldn't get you, and it helps offset the initial on-boarding costs. It's also probably true that once your one employee has gone through the selection, learning, and purchasing process, getting subsequent users on-boarded is much quicker, because there's someone sat next to them to say "just do it this way".
> On the other side of that equation, what you would hope is that your $50/month is buying you a subscription to steadily increased functionality over time.
Some people might hope for that. Personally, if I'm buying a tool for professional use, I want to choose my preferred one and then have it be reliable and future-proof. Things like security updates are one thing, but the last thing I want is random changes in functionality or UI being forced on me.
This worked just fine in the traditional model where you bought a version of some software, and then if a later version offered something more that you wanted, with adequate compatibility, you bought the upgrade too. The user gets stability and new developments if they want them. The developer gets paid for new developments, though only if they are actually valuable to users. No one gets forced into unexpected changes, removals, or compatibility breaks.
Tragically, the whole SaaS, insta-deploy-anything-we-feel-like culture that has evolved in recent years has utterly destroyed that stability and reliability. I regard this as possibly the biggest retrograde step in the history of personal computing (and that "possibly" is mostly because the walled-garden culture that has also become so powerful in recent years might be a larger backward step).
I don't disagree: there's a reason I prefer Debian Stable on my servers. But when done well, and in the right place, it can be a worthwhile avenue. GitHub is a pretty good example: they've not stood still (OK, they also haven't moved that fast either), but the core of what worked 10 years ago still works today.
It's easy to say "yes" and difficult to say "no". Becoming a financial steward starts with saying "no" and only saying "yes" once the ROI worksheet is filled out and approved. If you want your manager to approve spending some money on tools, you can help them by filling out your own preliminary ROI worksheet to show them why you think it's a no-brainer.
Buying a $5/mo subscription by accident, forgetting about it for years, and getting 0 value in return is cheaper than training on a formal purchase system, filling out an ROI sheet, and ping-ponging it back and forth in a few meetings before "responsibly" deciding not to buy.
Giving individual contributors purchase authority and then supervising spend -- the model implemented in every major cloud -- is a much better compromise, because it lets management prioritize where they spend ROI calculation cycles which are not free. Very not free!
What if that SaaS is data mining you, though, and your company secrets are leaking out? Or claiming some sort of ownership over content you post within it? I don’t know anyone who reads the privacy policies and TOS of those apps, but tech companies just let their teams use what they like.
It’s hard to find a good SaaS privacy policy, actually, even though they target businesses, who are theoretically more “serious” than individuals.
2. Enter cost per engineer of product (subscription, one time, etc.).
3. Enter required minutes to make worth your time (setup per employee can be factored in).
This may already exist, but if it doesn't, it could be a useful tool for SaaS companies. Something you could drop into emails to potential customers ("see how much you would save").
I have a spreadsheet at work that does this for manual tasks. It calculates minutes to perform the task × number of employees × yearly frequency of the task.
Once I've plugged in the values, I go back and ask if we really want to spend 30 hours a year updating intranet employee profiles or if we should use that time to fix some bugs or code a new feature.
It's quite effective for pushing back on unneeded tasks.
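A minimal sketch of that spreadsheet logic, for anyone who wants to adapt it (the example figures below are made up):

    # Yearly cost of a recurring manual task (example figures are made up)
    minutes_per_task = 10
    employees = 60
    times_per_year = 3
    hourly_cost = 50        # loaded cost per employee-hour (assumed)

    hours_per_year = minutes_per_task * employees * times_per_year / 60
    dollars_per_year = hours_per_year * hourly_cost
    print(hours_per_year, dollars_per_year)  # 30.0 hours/year, $1,500/year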
I don't think the problem is people at the executive level. It's that all kinds of teams across the org are paying for SaaS products and it adds up over time. The other problem is that 15 teams might be paying for small-group Slack (insert SaaS here) licenses when they should be on a single corporate plan. Ironically, I believe there is a startup that looks at accounting and locates duplicate licenses.
Hard to find for sure. There was a time, though, when we didn't pay CEOs and professional managers 500x as much as the lowest-paid employee. Back then we still managed to go to the moon, split the atom, develop computers, create antibiotics, and do lots of other neat things.
Do you have evidence for your assertion? Of what significance is the ratio of highest to lowest paid employee? Shareholders of a particular company pay CEOs, not whoever "we" is.
Producing more value than you are compensated for ... hmmm what does that sound like? You could say it sounds like the human condition. Which employees in the pyramid do you think have the worst scalar here?
This may be true for CEOs turning dysfunctional companies around on personal authority, but I find it hard to believe for companies that are already running perfectly well.
I have yet to encounter a manager who motivates me to get up and find a way to eat that day.
Just because they are there doesn’t mean the achievements are due to them.
There have been a number of attempts, tens of millions of dollars spent on one study alone, trying to quantify what technologies and people generate value.
The meta-conclusion across all the studies I know of (I’m traveling and not looking them up) is that the math becomes so Byzantine it’s a pointless measure, and we should just stick with ideology to avoid blowing up society in people’s minds (an idea that was peddled for thousands of years already, now reaffirmed by math; glad we spent the time on it).
Just because we’ve emotionally conditioned ourselves to engage our mechanical agency towards these ends does not imply it’s because of management.
Hm. Allocating credit for achievements is a tricky thing.
Management is necessary to allocate resources (including labor) efficiently at scale, but it comes with overhead.
As such, at smaller companies, startups, small businesses, the overhead of management can be outsized, the pain felt acutely when compared relatively to the actual output of ICs and such.
At larger companies, the massive cadre of management which is necessary to keep the machine running is also very obvious, as the overhead of a large organization is very large, even if the management itself is excellent and highly efficient.
As such... how does one allocate credit? An analogy would be financing or investment. Necessary. A filtering mechanism. Sometimes helpful in an advisory capacity. But are they the ones creating output? No. But do they deserve some credit? If something cannot be done without it, then it must deserve SOME credit; it's just a question of how much.
Why does society have to optimize at scale for assigning credit to a minority? If anything the studies show value is emotionally subjective.
A healthy society takes a village. IMO there’s plenty of literal history to show isolating a minority from the demands the rest of us face is ripe for abuse at scale.
One person did not invent languages, lay down the highways and invent computers.
All the people we hold up relied not just on the historical invention to push them forward but society giving them space and not killing them. Every individual inventor is outnumbered.
IMO that space to be and do is what we should optimize for. Not a tether to tradition of emotionally wanking off a handful over what is ultimately a linguistic twist on an idea that was discovered/defined collectively
Valve is, by all accounts, a toxic cesspit that's ruled by various influential cabals. They still have a hierarchy, they just aren't explicit about it.
At least for Medium, "Holacracy" pretty much imploded. Zappos switched from it to an internal marketplace - which is, oddly, controlled by executives.
Flat organizations beyond a certain size either move away from flatness officially, create unacknowledged hierarchy, or fail. I'm really not aware of a single large-scale example that still works.
This is much different for small orgs - there's something about org size that requires some sort of centralization. (I think it's communications overhead + complexity exceeding the limit of what a single person can keep in their mind. I'm not 100% convinced, but these are the pressure points I usually see)
The video game sector is unstable and toxic. IMO, Valve compares favorably to other video game companies as for one thing it’s still in business.
Flat organizations are unusual, but survive about as well as other similar organizations. The difference is when a giant reorganization/buyout etc hits, they more obviously change into something else.
“without” is the wrong word: you need managers but they need to have incentives which align with the actual business and accountability. Most companies have problems with at least one of the two, which is a loss but usually indirect enough that it's easy to excuse or deflect.
You mean the companies where executives use extensive corporate speak and pull deadlines out of their asses, without checking in with the experts on the ground, triggering office heroics down in the trenches. We'll do it - we all have rent to pay.
Yeah, you can fake through it more than half of the time, comfortably. Just be eloquent, tall, and loud.
Has Valve done anything noteworthy in the last 10 years besides milking the Steam store? Their half-baked console seemed like an attempt to do more of the former.
I know you are trying to be snarky, but Valve has accomplished a hell of a lot more than most startups ever will. They are far from perfect (hell, they’re known for bugs and their famously poor timelines, and I heard their game Artifact was not good), but it’s pretty bizarre to suggest they do nothing. Maintaining Steam alone is huge. It’s hard to find a platform with more vigorous fanboys for good reason.
But that all aside... I mean they have a new Half-Life game on the near-term horizon (later this month, apparently) and they have done a ton of work on VR and Linux support.
Summary: the game was brilliant ("uniquely amazing") if you were really good at it but was essentially impossible to get into, and its economy was badly constructed.
Yes, it is. Keeping Steam growing and well-liked is an achievement in itself. They never really stood still with it.
Steam certainly challenged the status quo when it was released, and several times in its life.
Hell, who else is working on Linux gaming right now? They were relatively early on game streaming and VR. Steam Workshop? How cool is it that a large amount of one of their flagship games, Team Fortress 2, not only integrated a lot of third-party content, but also paid creators back? How cool is it that many games like Counter-Strike began as Half-Life mods?
I don’t really use Steam much anymore, so it’s not that I’m personally attached much, but I will admit to being pretty impressed.
Is this sarcasm? Steam has been the only consistent video game cloud SaaS in the past 10 years. Can you name one time that the service was down without warning/proper recourse, had a data breach, exceeded SLAs, etc.?
Also, Valve's other ventures are... massively successful. The microtransaction model in gaming originated with Valve, and now the entire mobile and freemium gaming markets use that business model.
So in this case... I would say yes, everything Valve has done as a company trying to make money in the last 10 years is par for other successful SaaS platforms.
The title of this thread is "Companies fret as costs soar for software subscriptions", and this comment thread delved into cloud SaaS providers with a flat org structure.
Steam/Valve was brought up because they have a successful cloud SaaS model with no subscriptions and a flat org structure.
Xbox live has a subscription fee since day 1, is maintained by MS (the definition of "non-flat organizational structure"), and is vendor locked. Not even remotely comparable.
We were well over $80k into manpower spent optimizing our self-hosted app's resource usage, just so we didn't have to tell fewer than a dozen big customers to upgrade their hardware, before people stopped laughing right in my face at my suggestion that we just buy them some new rackmount equipment as a gift...
That project died the following year, and I believe the opportunity costs of that work were substantially responsible for the demise.
Lol, I do this. My customers would often try to run my software on dusty old servers and then complain that it's slow, so now I just send them the hardware for free.
Different team, we did a cluster install and they were getting 10% of our advertised transaction rate. Something was clearly off. I commented “did they plug the whole thing into a 10Mb hub?”
They had. We sent someone eight timezones away to find that out.
>Then what happens is somebody making $100+/hour would spend several hours to do the job that would be covered by a tool that costs under $50/month.
Sure, but then there's the slightly more complicated economic calculus: do we build our own tooling now, for a significant up front cost, or do we buy the SaaS which we might use essentially forever for an unknown future TCO?
How often do you build in house tooling or applications that have no ongoing operating commitments?
It's an upfront capital investment plus generally unpredictable maintenance vs. more predictable but maybe higher operating expenses.
I don't know if more predictable is all that accurate. SaaS providers change functionality and UX regularly and that often breaks workflows. Then you have the ones that shut down or get bought out and then substantially change their offerings. Then there are those services that would have been private to an intranet, but are now public, and they get compromised.
I don't think it's clear-cut. Some SaaS has worked out remarkably well for me and others I wish I had just built the thing in-house. Also, few providers give a way to actually get your data back out in a usable fashion, so you tend to get locked in without substantial cost to back out.
Nothing complicated about it. You start with SaaS, validate it's a good tool to have/actually used, evaluate monthly costs vs opportunity costs, build your own when/if it makes sense.
The slightly more complicated question is "do we invest in optimizing SaaS costs/utilization? Or use those resources to start building our own?". But even for this question the right answer is usually fairly obvious.
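As a rough sketch of that evaluation, every figure below is a placeholder to swap for your own:

    # Crude build-vs-buy comparison over a planning horizon (all figures are placeholders)
    saas_per_seat_month = 50
    seats = 20
    years = 3
    saas_total = saas_per_seat_month * seats * 12 * years             # $36,000

    build_upfront = 40_000               # engineering time to build in-house (assumed)
    build_maintenance_per_year = 8_000   # ongoing fixes and ops (assumed, often underestimated)
    build_total = build_upfront + build_maintenance_per_year * years  # $64,000

    print("buy" if saas_total < build_total else "build")             # flips with seats and horizon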
If it was so easy, why are the companies in the article "fretting"? Because putting values on "good", "opportunity", and "makes sense" (not even counting the fact that you have no control over the SaaS raising prices or going out of business in the future) is harder than deciding whether you can spend $400/month or $50/month.
It is complicated at scale and over time, though. Look at SaaS infrastructure: how you do it as you grow constantly changes the results of any evaluation.
"spending another $10K to make them even 10% more productive is a bargain."
$200K is not a good benchmark for what companies spend.
Median salary in the US is about $50K [1]
Average overhead per employee is 18-26% [2]
So a $10K spend for 10% increase in productivity on someone earning about $60K ... is not a bargain.
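Rough numbers, using the figures cited above (the overhead midpoint is my assumption):

    # Value of a 10% productivity gain on a median US employee vs. a $10K tool spend
    salary = 50_000                            # median US salary [1]
    overhead = 0.22                            # midpoint of the 18-26% range [2] (assumed)
    fully_loaded = salary * (1 + overhead)     # ~$61,000
    value_of_gain = fully_loaded * 0.10        # ~$6,100
    tool_spend = 10_000
    print(value_of_gain > tool_spend)          # False: not a bargain at these numbers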
But there's a problem with this, because SaaS can be
1) Individual productivity
2) Operational productivity
3) Product spending (i.e. like GMaps integration as a feature in your product)
So 1 and 2 versus 3 represent very different types of calculus, to the point where that kind of spending should definitely fall into different buckets for accounting.
The parent of the parent comment clearly referenced engineering with his “If you’re already spending >$200k/year per employee then a productivity increase of 10%...”
> And the productivity increase using certain SaaS tools is not 10%, it can be an order of magnitude in certain instances
well... my productivity isn't always increased, regardless of what tools someone wants to buy. and... sometimes it's a drain on my productivity, but using tool ABC increases someone else's productivity (at the expense of mine). How do you account for the productivity increase of -5% for 30% of your employees, but 20% for the other 70%?
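One crude way to account for it is a weighted average, using the percentages from the question (and assuming everyone's time is worth the same, which it usually isn't):

    # Net productivity change when a tool helps some employees and hinders others
    share_hurt, change_hurt = 0.30, -0.05
    share_helped, change_helped = 0.70, 0.20
    net_change = share_hurt * change_hurt + share_helped * change_helped
    print(f"{net_change:.1%}")  # 12.5% net; hides who the losers are and what their time costs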
- Process mismatch / impedance: The software is unlikely to be able to do exactly what you want, how you want. The human likely is able to.
- Vendor lock-in: If the SaaS decides to change how it works, or its prices, you now have a cost of leaving.
- Expenditure on someone else's competitive advantage: If you use tool X, there is likely no reason your competitor cannot. You both use it; no net advantage gained over the competitor. If you develop the expertise in-house (likely the harder thing), then you have an advantage over your competitor. Of course this has to be balanced against what will be your competitive advantage and what will not.
- Harder to manipulate: You cannot fire one of your SaaS vendors and tell the other two to do the work.
There's plenty of situations in business where using it is an incredibly good idea. There's even more situations where you're better off just paying for a service.
Nobody truly believes "FOSS is free"... sure there's a cost, you just gotta figure out what the cost is for you! (That's the trick: a "free" product has a unique price for each "buyer"; herein lies the beauty and the opportunity of the game.) There's no "market price", so it's tricky, but when you can compute the cost and it happens to be very low for your team (and preferably high for competitors' teams), you have a competitive advantage.
OSS is all about leveraging "hidden" advantages against competitors that have clear and obvious financial advantages over you. If you get it right, you win!
Of course, if you don't like risks and don't like "uncomputable" prices, or "prices unique to each buyer", then sure, be conservative, and stay away from FOSS.
1. You can, and it makes sense for FOSS projects with real support ecosystems, like Red Hat, or for freemium FOSS projects that explicitly sell support.
2. You can still be locked into a single supplier when you pay for third-party FOSS support. In fact, most third-party FOSS support comes from a single supplier.
3. FOSS that doesn't have a sustainable business model attached to it rarely serves the needs of enterprises. There's FOSS alternatives to Sharepoint, but what's your FOSS alternative to Tableau? Salesforce? Slack? Will you actually save money by using it, compared to just paying a vendor?
> There's FOSS alternatives to Sharepoint, but what's your FOSS alternative to Tableau? Salesforce? Slack?
From a purely technical POV, I'm pretty sure that feasible alternatives exist to all of these, e.g. I don't think Tableau's data viz product does anything that couldn't also be done with relative ease in R. Of course, these commercial offerings extend way beyond the purely technical domain where FLOSS makes the most sense; but that has little to do with "sustainable business models" - R itself is plenty sustainable on its own - and a lot to do with extremely niche or obscure "needs of enterprises" that FLOSS projects are either not clearly aware of, or not very interested in supplying. Niche and obscure problem domains have always been problematic for the open source model, this is nothing new.
This is reductive. If "10% more productive" translated into >= 10% more revenue as a percentage of their wage, it would make sense, but it varies so wildly that it's not a useful statement.
Some will translate to increased revenue, a few times over (which incentivizes the vendor to increase the price accordingly), and some will be net negative.
Depends on the context, I guess. I'm generally more comfortable with lightweight things - like GitHub projects for software and Trello for everything else.
The person who convinced me to [pay for a personal license for] JetBrains instead of torturing myself with Eclipse just passed on the questions that had convinced him to do it:
How much does JetBrains cost you? How many technical books is that? Is it worth that many books a year for a decent tool?
Include health insurance, retirement match, employer side payroll taxes, equipment, etc. I don’t make nearly that much but I easily cost my company more than that.
They are just going to pay their employees less. Companies have to be profitable, temporary exceptions notwithstanding, and the money has to come from somewhere.
As long as the purchased software is worthwhile (i.e. it makes employees more effective; the company gets more output per employee expense dollar), this trend should actually increase employee pay. Higher output per employee allows a higher salary per employee while still being profitable.
> many people in USA believe that corporation has duty to increase shareholder value. And similar stupid ideas.
No more stupid than the idea that companies are supposed to exist solely for the benefit of the employee. Shareholders are the ones that put the money up to start and grow the company in the first place, so yes, there is an obligation to shareholder value.
Don't like that? Then you are free to start a company that doesn't take investment.
Shareholders are only able to make contracts because the government uses guns to arrest people who break contracts. Don't like it? Move to an offshore oil rig, I guess.
Shareholders take a risk in their investment, mitigated by making it more fluid as they can easily trade the shares compared to more traditional investments.
There is, however, no law that holds a figurative gun to the head of a corporation saying that its duty is to increase the value of those shares.
Not sure if you are being facetious, but that's in fact the law. The board is a fiduciary of shareholder (investor) money. Is that a great idea for society? Not sure, until it changes we live in the world we live in.
And on the productivity point, that makes a good theoretical case, but it is just not demonstrated by historical data. In the US over the past 50 years the divergence even has a name (the pay-productivity gap, or the wage gap). You can look at productivity vs. wages since 1970 and see how they have not correlated. Further, you can look at income and wealth inequality growth over the same period (https://en.wikipedia.org/wiki/Income_inequality_in_the_Unite...) and see how the wealth generated by the productivity gains has been highly concentrated among the top 0.1% and 0.01% wealthiest. Some people will point to real vs. nominal wages (adjustments with something called the Implicit Price Deflator) that show wages' purchasing power increasing, but I think the costs of housing, college, medical care, and other things critical to social mobility, which have increased far faster than inflation, complicate that idea.
Fiduciary duty means that the board is supposed to be honest with the shareholders and not actively work against the corporation. It does not, in fact, mean that they are required to prioritize growth of shareholder value in any respect.
> Dodge v. Ford Motor Company, 204 Mich. 459, 170 N.W. 668 (Mich. 1919) is a case in which the Michigan Supreme Court held that Henry Ford had to operate the Ford Motor Company in the interests of its shareholders, rather than in a charitable manner for the benefit of his employees or customers. It is often cited as affirming the principle of "shareholder primacy" in corporate America. At the same time, the case affirmed the business judgment rule, leaving Ford an extremely wide latitude about how to run the company.
[...]
> In the 1950s and 1960s, states rejected Dodge repeatedly, in cases including AP Smith Manufacturing Co v. Barlow or Shlensky v. Wrigley. The general legal position today is that the business judgment that directors may exercise is expansive. Management decisions will not be challenged where one can point to any rational link to benefiting the corporation as a whole.
The caveat is important. A company just has to say, "Customers don't like it when rich people in suits treat <x> like shit. Therefore, if we spend a small amount of money treating <x> better, this will make our customers happy, and they will buy more of our products, recouping the cost and bringing in more profit for our shareholders." Substitute the environment, animals, customers, employees, people in Africa, cancer patients, the children, etc. It doesn't matter if you're completely full of shit, you just have to make the argument.
So while you're correct that companies have a primary requirement to increase profits for their shareholders, in practice, this requirement is incredibly loose, to the point of not actually being a requirement at all.
Doesn't higher output per employee lead to a change in demand for that type of employee, lowering pay?
It always seemed to me that the value an employee brings is the upper bound on pay (demand drops to near zero above that), but that the pay itself is determined by supply and demand for that labor.
The fully loaded cost of an employee for a business is not the employee's salary.
"The fully-loaded costs of employees are much higher than their salary: exactly how much higher depends on your locality’s laws, your benefits package, and a bunch of other HR administrivia, but a reasonable guesstimate is between 150% and 200% of their salary." -https://www.kalzumeus.com/2012/01/23/salary-negotiation/
I don't know much about jobs outside of tech, but I know most of them at the entry level to a few years of experience in accounting, finance, real estate, etc. do not pay much over $60k (if that), and pretty much all of those jobs are going to use things like Salesforce or other expensive SaaS tools and software.
Salesforce "Enterprise" is $1800/user/year [0], and presumably large enterprises are negotiating better rates than the published ones. Every single employee would need to be using five and a half packages like this at full list price to hit $10k/year.
In Sales you would have SFDC plus some kind of enablement tool like Salesloft/Outreach plus a prospecting tool like Sales Navigator or DiscoverOrg plus enterprise Gmail plus a communication tool like Zoom; and then there's the analytics stuff, the HR/finance parts, etc...10k is probably way too high an estimate but 5k is believable for a B2B startup with a scaling sales team.
I won't quibble too much with the precision of the $10k number, which will vary widely.
But it's pretty obvious that a basic "customer support" combo like Office 365 + Okta + Zendesk + Intercom + Slack + Zenefits starts to add up to a meaningful percentage of the employee's take-home compensation. (Remember these are positions where $20/hr is a good rate.) And remember, the comparison figure for managers over 50 is roughly $0, which approximates the spend on these functions as far back as 2008.
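As a back-of-envelope check on that (the per-seat prices below are illustrative guesses, not actual vendor pricing):

    # Hypothetical per-seat monthly prices for a basic support stack vs. take-home pay
    stack = {"Office 365": 12, "Okta": 6, "Zendesk": 49,
             "Intercom": 39, "Slack": 8, "Zenefits": 10}  # illustrative guesses
    annual_saas = sum(stack.values()) * 12                # ~$1,500/year
    take_home = 20 * 2080                                 # $20/hr full time ~= $41,600/year
    print(f"{annual_saas / take_home:.1%}")               # a few percent, before analytics, HR, etc.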
In London, I'd guess the median salary for most enterprises is around £30-40K. National insurance, a desk, and a few other benefits are probably £20K on top of that.
Fully loaded costs (health benefits, 401k matching, cost of computer, HR business partners, etc.) for an employee are roughly 2x what their salary is. So yeah, 200k is a good estimate.
That's still very high. Average salary is much lower than $100k in the US. Even among college grads with 10+ years of experience, $100k is still a high-end job paying a high-end salary. It's nowhere near the average.