There was never really an app economy in the sense of a thriving economy for apps on the App Store. It's always been a small fraction of apps that made it, with the rest not making any money. They could, however, go on forever, because the cost of keeping them up is close to zero, so it always looked way more successful and lucrative than it really was.
There is a much more successful app economy that's been around since the inception of the personal computer, and it's doing OK. Some are SaaS apps, others are little tool apps like my own, still others are open source, and yet others are shareware.
In other words: as long as you don't define "app economy" only as what happens on the Mac App Store, the app economy is doing OK and we haven't seen the end of it.
However, if you only think in terms of the Apple App Store, then I would say it was never there to begin with.
There are some larger trends that I believe render the idea of apps as products useless, but that's for another topic.
Not the OP, but I've been developing apps for years and agree with the sentiment. A few thoughts:
There are, in my mind, 4 broad categories of apps:
- 1: Games with microtransactions, think Clash of Clans, Candy Crush, Pokemon Go, etc.
- 2: Games without microtransactions (sold full-price), I honestly can't name any right now.
- 3: Apps tied to some kind of service that isn't exclusively on the phone, think Uber, Grubhub, Amazon, Gmail, Facebook. These are apps that are really just front ends to the service.
- 4: Apps that primarily provide value only on the phone, sold for money. Think Dark Sky... and not many others that are still alive.
Categories #1 and #3 are alive and well, and I think will continue to be for the foreseeable future.
Categories #2 and #4 are dead. The article says it's the "beginning of the end", but I'd argue it's very close to the end of the end.
The trend I've been seeing is that apps in category #4 are not only failing commercially, but also increasingly being pulled into the greater ecosystems of apps in Category #3.
Instead of buying a standalone flight-tracking app, it's now integrated into a travel company's app - think Expedia or Kayak.
Instead of buying a standalone weather app, it's now table stakes in either the OS itself, or integrated into larger apps like Google.
Instead of buying a standalone package/delivery tracking app, it's now integrated into a retail app, like Amazon or Wal-Mart.
etc. etc., that's the trend I'm seeing. Apps sold as standalone products are, generally speaking, dead; their ultimate fate is to have their feature sets absorbed into apps/services that make Actual Money(tm).
Perhaps dead, but #2 is the category I look for. I'd rather pay full price and get a full experience than spend 99 cents so that my little one can bake another batch of virtual cakes, then buy again in a week, then...
Agreed - I actually play little/no games on my phone/tablet because of the current state of affairs. It's nearly impossible to find a game that isn't just an abstraction of a slot machine.
Square Enix has done well with some puzzle games - Tomb Raider, Hitman, and Deus Ex all have turn-based puzzle games that are fun and priced on level packs rather than some bottomless pit of microcurrency.
But they are the exception.
I think this is representative of a larger issue in gaming in general - development costs have grown dramatically faster than sales, to the point where the economics of game development have become marginal. Despite blockbuster marketing efforts and millions of copies sold, many games have trouble recouping their costs, not just in mobile-land, but also in traditional strongholds like console games.
Your basic conclusion -- apps as standalone products paid for up-front are dead -- doesn't seem too surprising if we consider that apps have trained people from day one to think software costs almost nothing. If you're selling anything with substantial development costs to a market that behaves as if $2 is a lot of money for something on their phone, the only commercially viable options left are to attract a high volume of customers making lots of tiny payments over time or not to expect your app to be a significant source of direct revenue in the first place.
It's a phenomenon similar to the video game crash of the early '80s. Companies flooded the market with hastily made, shoddy games just to make a quick buck, but it resulted in nobody wanting to buy games anymore because most of them were garbage.
Still waiting for that one to rebound. The vast majority of game developers seem to be just remaking old games with new "content" in tired old genres.
Oh another game where you bleep an angry creature til they bloop and you get a point. Great. Oh, I can upgrade my bleeper? Wow.
I think it's because the industry attracts people who are, for whatever reason, avoiding meatspace activities. They get immersed in game industry memes and then that's all they know. I see a game about farming and it's like... Do you even understand what farming is like or why it's fun? But they don't... They just re-make MineCraft with farming graphics because they don't actually go out into the world and, like, farm. Do stuff. Understand what the world is like. Because the people who like the world never left it, never escaped to video games.
At this point I don't even play games any more, I just watch YouTube reviews of them and spend my time designing games on my own.
I think it's a huge missed opportunity though, and when people realize it there will be a gold rush on games that are actually about something.
Rebound? There are more games made in a year with diverse mechanics than could possibly be played in a year. Itch.io, Kongregate, Steam, "IO games" like agar.io, game jams, VR games... There's a lot of shitty games being made but there's not a lack of good ones.
I am happy to find new places to look for good games... But I don't consider "game jams" a serious suggestion, because they're not finished games. I think lots of cool games get prototyped and never made, that doesn't contradict my point or solve my problem.
I have spent a lot of time looking through Steam and it has resulted in my opinion as stated. I have also looked at a lot of VR games and not seen anything particularly outside-the-box reach the level of complete game.
I hope you are right, but telling me "just look for them!" isn't helping. I've spent hundreds of hours looking. Can you give me more specific guidance? Or narrow your list of suggestions to the ones you think will help me find finished games that aren't just rehashes of older, well-trodden game types?
I'm not saying games about something don't exist. I'm just saying they are incredibly rare and the entire game industry could disappear and it probably wouldn't affect the rate at which such games get made.
My take on this is that there are two factors contributing to this, and they neatly fall upon the boundaries of the two major app ecosystems. I think these issues are fixable, and I think big inroads have been made in the second.
Firstly, the Apple ecosystem, or really the iOS ecosystem (putting aside the Mac App Store for now), is fraught with overhead and bureaucracy. Bootstrapping as a developer is relatively expensive, given you need a modern Mac of some sort and have to pay a subscription, which I think cuts off some of the hobbyists. Then once you're rolling, the amount of red tape involved in testing, releasing, updating, etc. really is shocking. I'm involved in a project where we support an app for both Android and iOS, and really, 90% of the work on the Apple end is managing all this bureaucracy. This is massively expensive, and when asked for recommendations on another project I said "develop for Android first and then hire an agency to port it to iOS". The amount of messing involved really makes this a specialist competency, like solicitors' conveyancing or something, and it doesn't make sense having your innovation team waste their effort on it.
Secondly, there is the lack of quality control in the Android ecosystem. Aside from a few established titles, the Play Store really is a crapshoot in terms of whether you'll get a good app in a specialist area or somebody just chancing their arm. Like I say, this has come along a good bit, but on my Android devices I often just skip apps entirely because I really can't be sure whether I'm installing something dodgy or not.
I do appreciate the irony that each ecosystem's strengths derive from what I pose as its weaknesses, i.e. bureaucracy => better-quality app store; open submission process => lack of quality control. As both a producer and a consumer, however, I am just highlighting what I see as the key issues limiting growth. It must be said that I see ongoing improvements in the second area, but increasing impediments in the first.
> Then once you're rolling, the amount of red tape involved in testing, releasing, updating etc really is shocking.
I've been doing iOS dev for seven years, and have launched dozens of apps for organizations large and small, many of which had Android versions too.
I have no idea what you're talking about here.
Yes, you have to submit updates for approval, but it's quick and easy, and I can count on one hand the number of rejections I've ever gotten, all of which were easily fixed. 95% of my updates are approved in 1-2 days.
On the testing side, TestFlight and other tools make testing pretty quick and easy.
So what is the "bureaucracy" you're talking about?
The biggest issues with iOS are the 30% cut Apple takes, and discoverability on the store. The dev experience is actually pretty good, and the user experience is much better than Android.
Apple's balance here is FAR superior to Android's.
Well, like you say, you've been doing iOS dev for seven years, which kind of puts you into what I'd describe as the "specialist" category. I wouldn't go trying to develop an iOS application without somebody of your calibre on board, but then you're either an expensive developer with cross-disciplinary skills or somebody we have "on retainer" to deal with this kind of stuff. That's specifically what I recommended my colleague hire once she has her prototyping done, purely to deal with the machinations of the Apple ecosystem. Then just take it in-house for support and maintenance.
It's interesting you mention TestFlight specifically, as this is something we've had a huge amount of trouble transitioning to. For an Apple specialist this is probably a no-brainer, but for a product development org it's yet another headache, where we used to just be able to issue our own development testing certificates.
It's specialised, but nothing ground breaking. The major issue that I had when starting 6 years ago was having issues with code signing, but those have largely been fixed in the latest versions of Xcode. I never had any of the issues you described.
Getting an app in the App Store isn't much harder than integrating with a third party API. Frankly, I don't see how they could make it much simpler. You make a build, upload it, it gets reviewed and you push it to the store.
Adhering to the rules of the ecosystem is the price you pay for getting access to >100 million accounts that can drop pretty much any amount on your software with the touch of a button. Sounds like a pretty good deal to me. I've managed to make plenty of money through the App Store.
I only have my third-party empirical experience to go on. Developing for other platforms seems far more straightforward. If this has changed recently, then that is a good thing, but experiences in the meantime have left scars that might take a year or two to heal.
I had previously done Android development, and god, how I hated that. A gazillion phones with different screen resolutions, apps never looking exactly the way you want them to, random APIs breaking on random devices because some shoddy Chinese manufacturer skimped on certain key things, etc. etc. In comparison, iOS development seems pretty great to me.
Maybe you're right and I don't realize how complex it is, but I'm skeptical. Android seems worse in many ways, and I had no trouble starting iOS dev back in 2010. And it's gotten faster and easier since then, so I don't think it's just my amazing skills (though I'd like to take more credit).
What specifically is so onerous? Are you sure these aren't just organizational issues on your end?
Yes, it is largely organisational, in that I typically work in specialist technical fields where our manpower is directed elsewhere rather than at dealing with specific platform issues. The point is, though, that where you're talking about growth, these are precisely the areas where native is losing out.
> Yes, you have to submit updates for approval, but it's quick and easy, and I can count on one hand the number of rejections I've ever gotten, all of which were easily fixed. 95% of my updates are approved in 1-2 days.
I think the point is that if I'm a web developer and I want to update my web app, I can deploy in 1-2 minutes with 100% acceptance rate. The only fees I have to pay are for hosting, and everything else is totally under my control. I'm not subject to the caprice of whoever runs any official app store for some platform, nor to paying them exorbitant rents.
I did web dev for years and there are lots of trade-offs there too.
True enough, but having one organisation able to entirely dictate whether your app lives or dies is not one of them and never has been. That and the 30% cut were all it took to reduce our interest in iOS apps to near zero for any project I've ever worked on.
I think you're right about both of the two problems and the irony of them being almost opposite extremes.
With a web site, you can mostly avoid both, as web apps have evolved a variety of models where someone can try it out before making any kind of commitment. Don't like it? Close browser, done. Like it enough to look again later or ask friends? Bookmark and/or share. Like it enough to commit right now? Hit the sign-up button and create your account, or follow whatever other process the site offers for saving what you were doing more permanently. There's no red tape or royalties to worry about in publishing your site, and because of the lower commitment there's less of a worry about quality for users visiting for the first time.
One fundamental difficulty with the app store model is that while its uniformity may make it easy to use, it also offers little influence or control to app developers before someone hits that button to install. And as anyone who's worked on the web equivalent can testify, that's probably where the majority of the customer acquisition process takes place. You're pretty far into the funnel by the time someone is actually downloading your app to their device.
I'm not sure either emperor ever had much in the way of clothes. They sucked a lot of, dare I say, naive developers in because for the few apps that really did hit it out of the park the rewards could be astronomical. However, expecting to publish the next Angry Birds as an app is a bit like getting into web development and expecting to be the next Facebook. Sooner or later, someone will be, but many thousands more will have tried and failed in the meantime.
Ah, I think going on 10 years ago native apps were a good bit ahead of web apps in terms of what you could do. Even now, native apps can have the edge in a number of areas, but the gap has closed. I think, for instance, developing Angry Birds for the web would still be a little better off in the native space. But the thing is, when you're talking about growth, you're concerned not with these specialist areas but with the fringe areas, where many are simply deciding web is more straightforward than native and "good enough".
Fair enough, in the early days native apps had some big advantages. On iOS they still do to some extent, though that's mostly because Safari is such a limited browser and Apple won't let anyone else build something better on their platform. (<cynic>Gee, I wonder why.</cynic>)
As you say, though, the web just has to be good enough to rival native mobile apps, and for most purposes that's been the case for several years now.
The biggest thing Android's Play Store is terrible at (and I've heard the same charge leveled at Apple's App Store) is discovery.
In Android, here's my use case:
1. Identify a need I want fulfilled by an app.
2. Search the Play Store for some terms.
3. Receive a results page that gives me no information to discriminate (dodgy reviews, some basic screenshots, people saying it's unstable under the newest Android, etc.).
4. Go do a web search with "Android app" appended.
5. Browse through hundreds of shill sites that doubtfully even downloaded any of the "5 Best Android Paint Apps!".
6. Give up and do it on my laptop.
Imho, the key is leveraging the social graph. Which is something parts of Google seem to be clueless about (and parts seem to understand).
How about weighting reviews based on how closely connected I am to the reviewer?
Pretty much, though I find reviews too easy to game to carry any value whatsoever. What I would love is an expert mode that allowed me to filter/sort Play Store results by permissions, last-update time and release time. I have seen way too many results put apps from the early days of Android up top alongside stuff released in the last month, with no way to tell them apart except by opening each entry in turn and looking for data hidden behind the description foldout.
Agreed. The existing search options are also terrible. In the same way that I sometimes have trouble telling Amazon what I want displayed, and fall back on the Top 10 sold list for that category.
Or even a "Has this app been updated since the release date of the version of Android I'm currently running?" option.
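The kind of "expert mode" described above could be sketched as a plain filter over store listing metadata. This is purely hypothetical (the Play Store exposes no such API); all the field names here are made up for illustration:

```javascript
// Hypothetical "expert mode" filter over Play Store-style metadata.
// Keeps only apps updated since a cutoff date (e.g. the release date of
// your Android version) whose permissions fit inside an allowlist,
// newest first.
function filterApps(apps, { updatedSince, allowedPermissions }) {
  const allowed = new Set(allowedPermissions);
  return apps
    .filter((app) => new Date(app.lastUpdate) >= updatedSince)
    .filter((app) => app.permissions.every((p) => allowed.has(p)))
    .sort((a, b) => new Date(b.lastUpdate) - new Date(a.lastUpdate));
}

// Example: Android 7.0 "Nougat" shipped 2016-08-22.
const results = filterApps(
  [
    { name: "OldPaint", lastUpdate: "2012-03-01", permissions: ["CAMERA"] },
    { name: "FreshPaint", lastUpdate: "2017-02-10", permissions: ["CAMERA"] },
    { name: "SketchyPaint", lastUpdate: "2017-01-05", permissions: ["CAMERA", "READ_CONTACTS"] },
  ],
  { updatedSince: new Date("2016-08-22"), allowedPermissions: ["CAMERA"] }
);
// results keeps only FreshPaint: OldPaint is stale, SketchyPaint wants contacts
```

Two lines of predicate logic would answer both the "abandoned app" and the "dodgy permissions" complaints in one pass.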
How is bootstrapping expensive? If you're just starting off, you can get away with a $499 Mac Mini and a $99 a year developer license. If you don't want to invest that much in your app, I probably don't want it.
On the other hand, I'm a Windows developer, and I would also pay about $500 for a decent PC. I also pay $14 a month for a JetBrains ReSharper subscription and $30 a month for Pluralsight: $528 a year. Well worth the money considering what developer salaries are.
The chances of being a successful solely-mobile developer are infinitesimal. I was making more doing Windows Mobile development in 2008, working for a company, than most independent mobile developers are making today.
Good point. In my jurisdiction the Mac Mini starts at €569. That's pretty neat actually; I didn't realise they were still around! Might pick one up.
Point is though, that this is specialist capital expenditure that I don't need to make for other platforms. Also, as the platform evolves there is a requirement to keep it current.
Even if this doesn't mean I have to upgrade my hardware, it may mean upgrading to a newer version of Mac OS, which may be awful.
Windows development is an area I'm not so familiar with, so I can't comment. I tend to work in specialist niche areas with established domain experts who don't really have the time or interest to learn the minutiae of particular platforms, particularly where these are a moving target. We typically get agency guys to do that kind of heavy lifting.
"Specialist capital expenditure"? All development stacks require a learning curve and money.
Money to buy hardware is relatively cheap - $500. The last time I did mobile development where my company was billing by the hour was 2008. They were billing $125-$150/hour. You could buy a Mac with less than 4 hours worth of billable time.
If you're doing it for a personal project, you could recoup your investment easily with one side gig or by getting a better paying job.
The hobbyists and have-a-go developers aren't billing anything.
Look, personally I could afford 10 of these at short notice if I wanted to. But that's not the point. Growth in the apps market is slowing, and there's a whole market of developers going underserved because the cost of entry on other platforms is pretty much zero.
This is the case I pointed out for Android apps, and the article doesn't make any distinction. The costs for developing Apple apps are too high to make it worth the risk.
"Since it launched nearly 9 years ago, the iOS App Store has delivered more than $70 billion to developers, while downloads have surged 70 percent in the last year alone"
These stats may be "scary", but they sure ain't surprising.
> The number of app store applications is down significantly in 2017, averaging only 500 per day since the turn of the year.
Well, duh. Everyone and their dog already has their own app, so of course the number of new applications goes down at some point. It's called market saturation. Look it up some time.
> ComScore data suggests that half of U.S. smartphone users download zero apps each month
Well, duh. At this point I've figured out the set of 5-10 apps that I actually use, so it's entirely realistic that I will go for another few months before installing the next app. I actually find it surprising that every second customer downloads a new app each month.
> If you can get users to hit download, according to Flurry Analytics, only 36 percent of apps are retained after one month and only 11 percent for a year.
Well, duh. A lot of app downloads are for one-off usages, like "I need to scan this one QR code" or "I will use this Wifi analyzer to check which Wifi channels are overcrowded and choose the best channel for my router".
> Even worse, says Andrew Chen, the average app loses 77 percent of its daily mobile users (DAU) within the first three days, and 95 percent of apps aren’t used regularly after 90 days.
You can tell that something's very very wrong with the app industry when they consider it a bad thing that they can't get you addicted to their product.
EDIT: One more thing:
> [Pinterest] leveraged massive incoming traffic to their website and then used that as an opportunity to get users to download and install the app, a platform where in Pinterest’s case they could engage with the user even more deeply.
Why does every website ever want me to download and install their app, when the web is just as powerful an application platform for 90% of use cases? When they say "engage with the user even more deeply", the only two things that spring to my mind are annoying notifications and always-on tracking.
"Why does every website ever want me to download and install their app, when the web is just as powerful an application platform for 90% of use cases?"
Yeah, that's annoying as hell. It's like the old "best viewed with IE" all over again: a vast regression from a functional, vendor-neutral web.
> You can tell that something's very very wrong with the app industry when they consider it a bad thing that they can't get you addicted to their product.
I guess the problem is that the $0-1 people paid for the app is not enough. The real goal is attention and upselling.
If people had paid $10-20, it would be fine if they ditched the app, because the creator would already have been paid enough.
> Why does every website ever want me to download and install their app, when the web is just as powerful an application platform for 90% of use cases?
Especially if the app is a really badly made bloated webview wrapper that doesn't work when you are offline and only provides a subset of the website's functionality.
> Why does every website ever want me to download and install their app, when the web is just as powerful an application platform for 90% of use cases?
Because the web is fundamentally more accessible for users, and web sites are easier to build, their web site was where they actually... you know... operated their business. Because of that, the business pushed the web site as far as it could go... embedded advertising widgets, third party integrations, one off marketing pages, random features for this user or that internal department. The thing grew because it was being used.
Meanwhile the programmers programmed as fast as their little hands could type, and drove complexity right off the complexity cliff, because there is no Complexity Cliff sign... at some point you just realize you already went off it and you're headed for the ground.
At that point, the programmers subconsciously realize they need a new codebase, that time-to-feature is increasing superlinearly, but they don't want to admit their code is fundamentally flawed, but what if there was an excuse to start over....
Oh wait, an App! We can use Everything We Learned to try again with a Better Architecture, and really get The Design right and even add a couple New Features we've been cooking up! It's so much easier now to write an app than it was five years ago when I looked into it and it Seemed Hard, let's hire some iOS developers we'll have a V2 app in no time, and it's the Age Of Apps so most of our users will switch anyway. Especially since our web site is having... issues.
Flash forward plannedAppDevelopmentCycle * 3 and oh wait... our app takes 7 clicks and 97 seconds longer to get into than our web site. And only 38% of our visitors have a compatible device. And coding it is not easier. And damnit why are all of these users still using that crappy old feature don't they realize the app is cleaner and the animations are smoother and that makes up for any missing features?
And oh wait, complexity in our app is growing. And this new UI V2 concept our designer said would solve all our sign-up flow problems isn't actually improving metrics. And jeez... our web site is looking really long in the tooth now that we just spent 18 months prioritizing app development.
App developers: what percentage of apps provide no more functionality than just going to your website, but exist only to make it easier for you to collect data about me?
App developer here: there are two answers to that question, and one myth to dispel.
First the myth: apps do not necessarily collect more data about you. In fact as far as sketchy advertisement tracking goes, the web is far more pernicious and gives away much more of your data than native apps.
For one thing, on websites ad networks track not only what you do on that website, but can track you across sites. This is banned in native-land on pain of Apple swinging a very large banhammer on you. App developers are largely only collecting analytics on your actions in-app, and not feeding it into a larger ad network.
Even web views from inside the app (which are often used to show ads) do not have access to the cookies in your standard browser, making cross-app tracking considerably more difficult.
Anyways, to answer your question:
1 - On a strict numerical basis, most apps (70%+?) provide no significant improvement over visiting the website and are largely pointless.
2 - When weighted by installations or usage, most apps provide dramatic improvements over visiting the website and users would significantly suffer if forced into a website-only option.
Which is I guess the obfuscated way of saying: most apps are pointless, but nobody installs them anyway, they just languish in the far corners of the App Store.
The apps people tend to give a shit about (see: Facebook, Google Maps, Spotify, etc) tend to have a significant amount of functionality that is only possible (or only practical) in a native app, and would be significantly worse as a website.
Hmm, where to even start? Some off the top of my head:
- Web doesn't have access to the camera. Uploading a photo to Facebook is easy because you can launch the camera directly in-app, without having to use a separate camera app, save the image to disk locally, and then pick it for upload. Also important is the rise of photo filters, which would be impossible to live-preview on a website.
- Similarly, there is the rise of live streaming, which would be impossible from a mobile site.
- Uploading photos/videos is also easy because you can leave the app as soon as the upload begins; the upload will finish in the background. This is flat-out impossible via a mobile website. With a mobile site you'd have to keep the browser page open and be unable to do anything else with your phone until any uploads are complete.
- Offline/semi-offline interactions are actually possible in a native app. The nature of phones is that it can be out of connectivity, or be in areas with unreliable connectivity, and should still function. With a native app actions performed while offline/marginally online (posting, commenting, liking, etc) can be cached and performed in the background once the phone has better connectivity. A website cannot do this.
- On the consumption side, there are many advantages to the native app:
- The incremental page loads that FB does (and Twitter, and Instagram, and everyone else) bog down a browser quickly. Laying out the entire DOM is extremely resource-intensive and a major source of poor scrolling performance. Even on modern desktops, try scrolling down a bunch of pages in your FB feed (or Twitter feed) and watch the performance tank until scrolling is like pulling teeth. The native apps are immune to this kind of limitation, since there is no DOM and only what's on screen is actively being laid out (all off-screen content has no impact on render performance).
- No whole-DOM layout also means that the app is more responsive on launch. In app dev we have a concept known as the "cold start time" - which is the latency between tapping on an app's icon and having the app be fully interactive and ready for the user. Webapps do a lot worse in this regard for many reasons: needing to transmit the entire DOM, JS performance issues, layout CPU consumption, etc, that native apps are less susceptible to, or fully immune to.
- In bandwidth constrained situations (which is, well, a lot of the world), a native app passing purely semantic data consumes a lot less data than a webapp that needs to transmit semantic and presentational data. A JSON or Protobuf payload is a lot smaller than a HTML/JS/CSS payload.
- Media is more effectively fetched in native, which is aware of cell vs. wifi, and can customize the resolution/size of content fetched based on network conditions. The browser provides no access to the primitives that it would take to implement this.
- Videos can be inlined in the stream, which is impossible on most mobile browsers, allowing users to quickly preview video content before deciding whether or not to watch it.
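The render-performance point (only lay out what's on screen) is essentially windowing. A minimal, framework-agnostic sketch of the arithmetic, with illustrative names of my own:

```javascript
// Minimal windowing arithmetic behind list virtualization: given a scroll
// offset, only the handful of rows intersecting the viewport get laid out,
// so the total feed length has no effect on render cost.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows) {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) - 1
  );
  return { first, last };
}

// 10,000-row feed, 600px viewport, 100px rows, scrolled 1,250px down:
const range = visibleRange(1250, 600, 100, 10000);
// range covers rows 12 through 18: seven rows, regardless of feed length
```

Native list views (UITableView, RecyclerView) do this for free; on the web you have to build or import it, which is part of why naive infinite-scroll pages degrade.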
- Web does have access to the camera. I regularly upload images to the Facebook mobile site. You can also do live previews of images and allow users to experiment with filters before uploading.
- Live streaming on mobile web is also possible. If you support DASH and HLS you can reach the vast majority of mobile users [0]
- It's possible to support offline apps with mobile web, but it's definitely tricky. The biggest issue for most people is that you can't perform work in the background. Service workers are an attempt at improving on that. But you can still save data in localStorage or IndexedDB and push it up when the user reconnects. There's an offline and online event you can hook into.
- Native apps aren't magical. If you're dealing with large collections on the web, you can also virtualize them; look at react-virtualized for an example [1]. That lets you avoid creating as many DOM nodes all at once.
- You can achieve the same thing by writing your web app as a shell that later loads additional content. That way the initial load feels faster. Check out Progressive Web Apps to see the direction that's headed.
- Mobile web apps usually use JSON too. Nothing special about native apps here.
- Both DASH and HLS fully support dynamic quality. Heck, DASH literally means "Dynamic Adaptive Streaming over HTTP". By supporting both you can target most mobile web browsers. Native apps use these same protocols. It's already implemented by the browser vendor or by native APIs.
- You can jump to the middle of a video stream on mobile web. It's exactly the same as with a desktop browser. Go open up YouTube and skip a few minutes ahead.
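On the offline point above, the buffer-and-flush pattern is simple enough to sketch. This is a minimal illustration with hypothetical names; in a real browser app you would persist `pending` to localStorage or IndexedDB and call `flush()` from a window "online" event handler:

```javascript
// Minimal offline write queue (hypothetical sketch). Writes made while
// offline are buffered and replayed when connectivity returns.
class OfflineQueue {
  constructor(send) {
    this.send = send;   // function that performs the actual network write
    this.pending = [];  // writes buffered while offline
  }
  submit(item, online) {
    if (online) {
      this.send(item);
    } else {
      this.pending.push(item); // persist this in IndexedDB in practice
    }
  }
  flush() {
    // Call this from a window "online" event handler in the browser.
    while (this.pending.length) {
      this.send(this.pending.shift());
    }
  }
}

// Usage: buffer two writes while offline, then flush on reconnect.
const sent = [];
const q = new OfflineQueue((x) => sent.push(x));
q.submit("a", false);
q.submit("b", false);
q.flush();
console.log(sent); // writes replayed in order
```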
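And on the virtualization point: the core of what libraries like react-virtualized do is just windowing math, i.e. only the rows intersecting the viewport get DOM nodes. A simplified sketch, assuming a fixed row height:

```javascript
// Windowing math behind list virtualization: given the scroll offset,
// compute the index range of rows that are actually visible, so only
// those need DOM nodes (fixed row height assumed for simplicity).
function visibleRange(scrollTop, viewportHeight, rowHeight, rowCount) {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.min(
    rowCount - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) - 1
  );
  return { first, last };
}

// A 10,000-row list only needs ~12 DOM nodes at any moment.
const range = visibleRange(1000, 600, 50, 10000);
console.log(range); // { first: 20, last: 31 }
```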
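On the DASH/HLS point: the adaptive logic the browser or native player implements boils down to picking the highest rendition the measured bandwidth can sustain. A simplified sketch, where the bitrate ladder and safety margin are illustrative, not from any real player:

```javascript
// Simplified rendition selection, the heart of adaptive streaming:
// pick the highest-bitrate rendition that fits within a safety margin
// of the measured bandwidth, falling back to the lowest quality.
function pickRendition(renditions, measuredKbps, margin = 0.8) {
  const budget = measuredKbps * margin;
  const sorted = [...renditions].sort((a, b) => a.kbps - b.kbps);
  let choice = sorted[0]; // worst case: lowest quality
  for (const r of sorted) {
    if (r.kbps <= budget) choice = r;
  }
  return choice;
}

// Illustrative bitrate ladder.
const ladder = [
  { name: "240p", kbps: 400 },
  { name: "480p", kbps: 1200 },
  { name: "720p", kbps: 2800 },
  { name: "1080p", kbps: 5000 },
];

console.log(pickRendition(ladder, 2000).name); // "480p" at a measured 2 Mbps
```

Real players re-run this per segment, which is what lets quality adapt as conditions change mid-stream.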
It's not the end of the app economy, it's just the app economy is in long-tail mode now. Every obvious app that's going to have wide appeal has already been created. Now it's about filling in the niches. Though there are fewer users in the market for a niche app, I'd wager they'd be much more willing to pay for it - so, paradoxically, there may be more money to be had in these niche apps. For example, I'm a musician and I'll drop $10 on a niche app without batting an eye.
Worth bearing in mind that this is a piece of PR / thought leadership for a media conglomerate that mainly plays in sub-Saharan African countries and India, where data is more of an issue and people are choosier about what they keep on their phones. It's not immediately obvious on a first read unless you can spot the signs of such a piece.
If you look at this argument through a developed market lens there are obvious weaknesses, for instance the complete failure to address WeChat.
I'm not an app-making professional, but it is my understanding that web versions are deliberately sabotaged to coerce people into downloading the app (so they can steal more data? Dunno).
That absolutely seems to be the case. Which is why I have such a problem with this article discussing the benefits of creating quality app and web experiences, then holding up one of the worst offenders at degrading the web to force app installs as a paragon of good design.
This is so stupid. I mean, I don't need the apps, so any kind of coercion is pretty much doomed to fail. I don't usually need LinkedIn, and I definitely don't "need" Pinterest, ever.
Making their websites suck more to encourage me to install some never-used app just means I use their whole product less, making it almost impossible that I'd ever get invested enough in it to install an app for it. Conversely, if their websites were awesome, it might make me use them more, which would only increase the likelihood of me wanting to go the extra mile to even install an app.
I would guess that we see this because the folks with their hands on the levers aren't measuring failure: their metrics are all targeted at success (app conversions). So anything they do is calibrated to move that needle without awareness of (or caring) how many users they're losing.
... Or at least, that's the best explanation I can come up with for why some companies seem so dead set on shooting themselves in the foot.
And it's very frustrating that Google Images ranks Pinterest results so highly, usually in the first and second rows. What's the use of a search result when I can't actually click through to the result?
Here is why the startup I write code for wants an app first and browser code second.
We would actually prefer the browser by leaps and bounds: it cuts out the need for central servers, decentralization is a core idea, and you should remain in control of all your data.
The problem: the only viable way to store a significant amount of data in the browser is IndexedDB, which only lets you use a fraction of the total available storage - and, much worse, the browser can wipe the data at any time without warning.
It is simply not possible to provide our data-storage system without either a central server backing up the browser "app" or native apps for iOS and Android.
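For what it's worth, the Storage API's navigator.storage.persist() and navigator.storage.estimate() can reduce (though not eliminate) the eviction risk. Here's a hedged sketch of the kind of decision involved, with the estimate and grant injected as parameters so the logic runs outside a browser; the function name and return values are hypothetical:

```javascript
// Decide whether browser storage is safe enough to rely on, given a
// StorageManager-style estimate ({ usage, quota } in bytes) and whether
// persistent storage was granted. Hypothetical sketch, not a real API.
function storagePlan(estimate, persisted, neededBytes) {
  if (!persisted) {
    // navigator.storage.persist() was denied or unsupported: the browser
    // may evict this origin's IndexedDB data under storage pressure.
    return "server-backup-required";
  }
  const free = estimate.quota - estimate.usage;
  return free >= neededBytes ? "local-only" : "server-backup-required";
}

// In a browser this would be fed by:
//   storagePlan(await navigator.storage.estimate(),
//               await navigator.storage.persist(), needed)
console.log(storagePlan({ usage: 0, quota: 1e9 }, true, 5e8)); // "local-only"
```

Even with persistence granted, eviction isn't contractually ruled out, which is why the server-backup fallback in the comment above still ends up being necessary.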
I sure hope so. We need to improve the web experience on mobile, not foist dozens of mediocre apps onto users.
It's telling that HN has one of the best mobile experiences of all the sites I regularly visit on my phone. I frequently find myself jabbing at the screen in frustration when operating more "complex" sites.