IMHO it's the best language I've used to any significant extent, and there's no close second (my sample size being Swift, ObjC, Python, JS, TS, PHP, Java, Kotlin, C#, Haskell).
It's the only language I've used where I have never asked "but why?". Every language feature and stdlib API just does what it says and acts in the way that makes the most sense. You're basically forced to handle every case (null, any type of error, switch cases, etc). It's highly dynamic, yet code tends to stay understandable. Apple is thoughtful about naming things and designing APIs. It's the only language I almost never need to write utility functions for - generally the type you're operating on has a method to do what you want, which you can easily discover with auto completion. Strings are sequences of graphemes, not code units or code points, and their API reflects this.
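As a rough illustration of that last point (output noted in comments; this is just standard Swift String behavior):

    // Swift iterates user-perceived characters (grapheme clusters),
    // not code units or code points.
    let flag = "🇨🇦"                    // two regional-indicator scalars
    print(flag.count)                  // 1  (graphemes)
    print(flag.unicodeScalars.count)   // 2  (code points)
    print(flag.utf16.count)            // 4  (UTF-16 code units)
    print(flag.utf8.count)             // 8  (UTF-8 code units)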
It's not perfect, but nothing else comes as close. One of the most prominent complaints - compile time explosion - really only happens in contrived examples where you're using a bunch of literals in one expression, which isn't realistic.
However, it's stuck in a single ecosystem. And let's not talk about some of the newer frameworks. Oh, and Xcode is a joke.
And the type system is built in a way that it doesn't really help you, because type checking is so slow.
Make an error or begin refactoring in OCaml: just follow the type checker until you're done.
Make an error or begin refactoring in Swift: you plan ahead, because the type checker will give up with a timeout or produce non-root-cause errors.
Yesterday I changed the signature of a method, and instead of complaining at the call sites I got an "ambiguous method" error for a built-in SwiftUI view modifier. Kinda hard to track down if you're not completely aware of the changes you've made.
This falls under the "new frameworks" category I glossed over. SwiftUI, SwiftData, etc, are failures in my opinion. SwiftUI in particular abuses the type system. In your own code the type system is very helpful.
Yep. If you're writing good code that avoids common smells, things are perfectly speedy. Usually if you're making the compiler grumpy you should be sitting back and reconsidering your design anyway.
Xcode (and other tooling) being a joke, plus the compile issues, makes it tough for me to think of it as my favorite, but the language itself is definitely nice to work with.
The compile time issues I have mainly come from TCA, but I still see them from time to time in normal code.
I have a few other syntax gripes, but largely see it as competitive with Typescript and Rust in terms of how nice it is to use. TS and Rust blow it out of the water with tooling, though.
I'm optimistic about Rust and look forward to trying it.
TS to me is just an awkward patch on top of JS. It's really great! But only compared to JS. The mental model of its type system is difficult to form completely.
It might be due to the convenience of familiarity, but I disagree about TS. If you forbid any/unknown it doesn't feel at all like an awkward patch, and as far as I can tell the type system is much more powerful than Swift's.
The main problem with TS is that your types are compile-time. You can have 0 uses of any/unknown, but you can still stuff a number into a string at runtime without any issues until it blows up.
I know you should have defined validators for every single external API, library and database, but at some level it feels silly to do all of this extra work.
Of course, but that comes with the territory. I just think it's really impressive how far TS has come, and doesn't feel like an "awkward patch over javascript" at all these days.
In new projects, I've found that it's now very rare to come across a library that doesn't provide typings, or violates the interface.
Not my experience at all. I develop daily in Xcode and compile times are excellent. One can't really compare interpreted languages like TS/JS with machine code generation. Apples and oranges. The Swift compiler is fast, faster than C++'s. In my experience the Golang compiler is the fastest, but the language is much simpler. The Rust compiler is slower than C++'s.
As a side note, for some reason people like to bitch about Xcode, just out of habit perhaps. It's kind of a baseline statement. Not my experience either. It's an excellent IDE for the current year of 2025. It helps me a lot and I build end-user facing apps; the proof is the love that users give to my apps. In other words, I have skin in the game.
The compile times are ok, but slower than Rust and C++ in my personal experience. What I was referring to with "compile time issues" is the occasional type resolution issue that hangs the compiler for seconds/minutes, and requires some refactoring to mitigate. This does happen a lot less with Swift 6 than it did with 1-4 though.
I've used a lot of IDEs, and I spend plenty of time using Xcode. It's the worst I've used by far, and the only one where I find it necessary to edit the majority of my code in a separate editor.
Writing apps that people love makes you a good developer, but says little about the tools you use.
Sometimes people mistake Swift for Xcode and vice versa. If the compiler can't figure out a type in a reasonable time, give it some room to breathe by adding a clarifying type annotation. I don't like fighting the compiler and trying to bend it to my will. I am happy to offer extra help by clarifying types – magic costs performance. The compiler gives back with blazingly fast builds.
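To make the "clarifying type" point concrete, here's a contrived sketch (not from any real project) of the kind of annotation that tends to help: literal-heavy expressions force the compiler to consider many possible numeric types, and pinning them down shrinks that search.

    // Everything inferred: each literal could be several numeric types,
    // so the type checker has a much larger space to explore.
    let total = [1, 2, 3].map { $0 * 2.5 + 1 }.reduce(0, +)

    // Same computation with the types pinned down explicitly.
    let doubled: [Double] = [1, 2, 3].map { (n: Int) -> Double in Double(n) * 2.5 + 1.0 }
    let total2: Double = doubled.reduce(0.0, +)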
By the way, I am impressed by the speed of Swift 6 project builds on Apple Silicon Macs. My non-trivial apps, using a mix of Swift/C++/C files (hundreds and hundreds), get compiled in nearly real time. Feels good.
Mistaking Swift for Xcode isn't the issue here. Swift is great; I'm just pointing out the issues I've had with it in large projects.
> ...give it some room to breathe by adding a clarifying type definition
Right, but that's the problem - it's the only language I've used where this is something you have to worry about, and it's not like the type system is more powerful. And often, the issues stem from external libraries outside my control. Better compiler errors would help mitigate the issue.
There's a lot to like about Swift, but I also think it's productive to identify the parts that compare unfavorably to other languages.
Xcode regularly crashes and/or ends up leaving garbage in its garbage XML files. The un-reviewable XML stuff is also a garbage of its own making. The build system is intractable. Xcode does not lend itself to segregating builds from dependency fetching, i.e. doing all the internet things on one machine and the building on a different one. The debugger conks out at least weekly. Sure, you can make good apps with it. We've been making good apps with less for decades too. Other platforms/editors are not as painful; it is as simple as that.
The only XML Xcode generates is storyboards and xibs (and schemes, too). All of those XML files are readable. Xcode's build system is very reasonable (different from other IDEs, but that does not mean bad!).
How interesting! Xcode very rarely crashes on me. I don't know why, but it seems to work 99.9% of the time the way I expect it to work. Its build system makes sense to me. Of course, it differs from other build systems, but that's life; I am flexible and can learn things. Just saying.
Because it's annoying to have to implement it every time you write a one-off script (or a small project), among other things.
And it's a pretty standard method for iterators to have, so that someone maybe not as familiar with concepts like reduce can just sum the sequence.
Or if someone not familiar with reduce reads your code and goes "what is that `.reduce(0, +)` thing everywhere!?", when it could instead be a no-brainer `.sum()` <-- "yup, makes sense".
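A minimal sketch of what such a helper could look like (hypothetical; sum() is not in the standard library):

    extension Sequence where Element: AdditiveArithmetic {
        /// Adds up the elements; just reduce(_:_:) under the hood.
        func sum() -> Element {
            reduce(Element.zero, +)
        }
    }

    [1, 2, 3, 4].sum()      // 10
    [1.5, 2.5, 3.0].sum()   // 7.0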
It is not stuck in a single ecosystem, and VSCode has an extension for using Swift in it. The Linux ecosystem is actually getting quite interesting tbh, and they even support embedded targets!
"Liking" is irrelevant to me. I do use it thoroughly though – I have written Linux containers (web apps) as well as native macOS and iOS apps. I am end-product creator and want my ideas to become end-user products and see the light of day. Delivery is everything.
Swift offers convincing ease for writing abstractions that underpin the app layer. You can be as clever or as simple as you wish, it's up to you. I prefer to keep it simple.
The other huge win for me is its ability to combine multiple language codebases at the SOURCE CODE LEVEL under a single project umbrella.
Many of my projects use a mix of:
- Swift
- C++
- C
The whole thing gets compiled from a bunch of source code folders with a single command. No wrappers or binary wrangling; you simply call C++/C from Swift.
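For anyone curious what that looks like in practice, here's a minimal SwiftPM sketch of such a mixed-source setup (all names hypothetical; C++ interop additionally needs the interoperability setting on newer toolchains):

    // swift-tools-version:5.9
    // Package.swift: one C target plus a Swift executable that imports it.
    // `swift build` compiles both from their source folders in one command.
    import PackageDescription

    let package = Package(
        name: "MixedApp",
        targets: [
            .target(name: "CCore"),          // plain C sources in Sources/CCore
            .executableTarget(
                name: "MixedApp",            // Swift code can `import CCore`
                dependencies: ["CCore"]
            ),
        ]
    )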
Compile time is really fast for me with Swift 6 onwards.
I use Xcode (and I am fine with it, it compiles really fast in year 2025) or VSCode (with Swift plugin) — both achieve the same thing in my experience.
Want to build a Docker Linux container? Very simple too. I fine-tune and debug the project on a dev macOS machine, then use a Docker Linux container to build it for Linux (on a Mac or Linux) and deploy the binary. It's a single binary file by default, all libs inside. Then I copy it to my NAS or deploy it to a remote Linux server and enjoy the speed.
Low friction, easy abstraction, fast delivery and binary machine code executable. That's what speaks to me.
Personally I originally come from the "Apple world", but now I use Swift on the server and it's a joy. By far my favorite language ever. I even wrote a tool to be able to run Swift scripts that have non-system dependencies![1]
Swift as a language is fine; the problem is the ecosystem. It's been driven by Apple for too long now, and suffers a lot from it.
You would think that being constrained by one company would ensure the project remains lean, but Apple actually had the opposite effect. Instead of focusing on real cross-platform support or strong language foundations, they kept piling on features that nobody asked for but that made nice developer conference demos.
That is probably the opposite of true. Go read the forums once in a while; the foundations are really well thought out. I agree that function builders were probably added for the shiny SwiftUI (and were added too soon), as were property wrappers, but apart from that, the process of adding features to the language is actually quite transparent and well-defined.
Also, Swift now has real cross-platform support. All of the libraries I create I test on macOS, Linux, Windows, Android and WASM/WASI.
I tried building cross-platform libraries in Swift for the past two years, with no success. I just did it in Rust (iOS / Android / WASM) in a week, knowing nothing about the language.
The Swift WASM project still uses a forked compiler, IIRC…
Like I told you, I cross-compile to all of the targets directly on my Mac (except Windows, which I technically do on my Mac, but in a VM). I could use Linux all the same.
Swift now has the concept of a Swift SDK (`swift sdk install …`) and it is possible to easily target whatever has an SDK. Yes, they took their sweet time, but it now works properly.
Interesting, thanks. Do you have a resource that shows basic examples for Android & web?
Rust also has a very extensive set of tools to generate bindings (wasm-bindgen & uniffi), and a wide ecosystem of crates for all platforms (rusqlite, web_sys, reqwest, etc).
What's the experience like in Swift for you? What part of the Swift library ecosystem can you use in practice, and how do bindings work?
Let's say I want to expose an async call to a local SQLite instance on Android. How would that work, and what would that look like from the Kotlin POV?
Also, how is the size of the WASM packages? How much cruft does Swift need to embed?
PS: reading the thread a bit more, it seems like you still need some special options and a specific version of the toolchain. Is that still the case, or can I try it with the default toolchain distributed with Xcode?
I find it intriguing but have so far been unwilling to convince myself to give it a try on anything. It has a lot of good ideas, but I think Apple needs to relinquish more control over its future and direction for me to feel like a future plan change at Apple won't jeopardize its usefulness outside of Apple platforms. Presumably the reason why they want to keep it under their own organization is specifically so they can control their own destiny better, since Apple is heavily using Swift themselves; totally understandable, but trusting it for something that needs to be cross-platform in the long term seems potentially unwise.
It's not fool-proof either. Microsoft started the .NET Foundation, but that hasn't stopped them from causing drama by pushing self-serving decisions from the top down. I don't really fear the same sort of behavior from Apple very much, but I definitely worry that Apple might eventually lose interest in the cross-platform part.
This is especially troubling because it is a fairly innovative language. If you get trapped on an unmaintained port of it, moving off of it seems like it might be hard. It's got a lot of clever ideas I haven't seen elsewhere.
Swift evolution and governance is open and publicly documented. It will always be dominated by Apple engineers, but the evolution of the language is largely orthogonal to Apple's OS releases.
I'm not sure how much of the standard library is available on the server side.
However, I think it's more about the engineers' interest than Apple's, and in that respect the Swift ecosystem has been evolving constantly, e.g., the Swift toolchain was entirely divested from Xcode a month ago.
I can't speak for the .NET ecosystem, but your fears are unfounded. Whether Swift is useful in a cross-platform context is another question, however.
Orthogonal? Odd thing to say given Swift's evolution and release timeline are entirely constrained by Apple's OS release schedule. We're currently going through the spike in evolution proposals by Apple engineers in preparation for the branching of Swift 6.2 next month before WWDC in June.
As for server side, the standard library is entirely available on other platforms, with a subset available for embedded Swift. However, it's fairly limited when compared to something like Python, and cross platform support for the other libraries like swift-foundation or SwiftNIO is more limited (IIRC SwiftNIO still doesn't support Windows properly).
I'm not sure what you're talking about with the toolchain. Apple has been producing toolchains that can run on macOS outside Xcode for years. Do you mean the integration of swiftly? I think that just brought swiftly support to macOS for the first time.
Ultimately I agree with jchw; Swift would be in a much better position if it wasn't controlled by Apple's process. Features could get more than a few months work at a time. We could have dedicated teams for maintenance intensive parts of the compiler, like the type checker or the diagnostics engine, rather than a single person, or a few people that switch between focus areas.
Firstly, I believe the fears are founded; these fears are a good starting point, since learning and adopting a programming language is a big investment, and you should be careful when making big investments. To say they're unfounded suggests that they have no basis. Disagreed.
Secondly, I don't really feel like this sort of analysis does much to assuage fears, as Apple's business strategy is always going to take priority over what its engineers individually want. Apple of today doesn't have any obvious reason to just go and axe cross-platform Swift, but if that ever changes in the future, they could do it overnight, like it was never there. Could do it tomorrow. It's not much different than an employee getting laid off abruptly.
This is especially true because in truth Apple doesn't really have a strong incentive in the grand scheme of things to support Swift on non-Apple platforms. Even if they use it in this way today, it's certainly not core to their business, and it costs them to maintain — costs they may eventually decide benefit their competitors more than they help Apple.
There's no exact heuristic here, either. Go is entirely controlled by Google and does just fine, though it has the advantage of no conflict of interest regarding platforms. Nobody writing Go on Linux servers really has much reason to be concerned about its future, partly because Google has quite a lot of Go running on Linux today; and given how long it took them to e.g. transition to Python 3 internally, I can just about guarantee you that if Go died it would probably not be abrupt. Even if it were, because of the massive number of external stakeholders, it would quickly be picked up by some of the large orgs that have adopted it, like Uber or Digital Ocean. The risk analysis with Go is solid: Google has no particular conflict of interest here, as they don't control the platforms that Go is primarily used on; Google has business reasons not to abruptly discontinue it, especially not on Linux servers; and there are multiple massive stakeholders with a lot of skin in the game who could pick up the pieces if Google called it quits.
I believe Apple could also get to that point with Swift, but they might need a different route to get there, as Swift is still largely seen as "That Apple Thing" for now, by a lot of outsiders, and that's why I think they need to cede some control. Even if they did fund a Swift foundation, they could still remain substantially in control of the language; but at least having other stakeholders with skin in the game having a seat at the table would do a lot to assuage fears about Swift's future and decouple aspects of governance from Apple in ways that would probably ultimately benefit Swift for everyone.
P.S.: And I'm not singling Apple out here, because I think any rational company has to make tough decisions sometimes, but it's obvious from their past that they definitely don't fear changes of plan. Look all the way back to OpenDoc. Being willing to make bold changes of plan feels like it's a part of their company DNA.
It's a great language for Application-level software on a single platform (could be Linux).
It has some challenges that it needs to solve to do great as a cross platform "general-purpose" programming language.
It's paradoxically high-level in its syntax and ergonomics, but tied down by the same cross-platform headaches as low-level languages (e.g. cpp). Linking across platforms requires lots of careful thought and testing. Unlike cpp, it's not super portable. It requires a hefty 30 MB runtime for some features of the language to work. Try building a statically linked hello-world executable.
That being said, it's possible. You can build cross platform applications with Swift, but you'd still have some of the same kinds of portability issues like in cpp but with nicer syntax and ergonomics.
PDF is a famously (and hilariously) wild document format because it satisfied the need to recreate a piece of work faithfully using thousands of kinds of outputs, some of which didn't even exist when the document was created, ideally at arbitrary pixel resolution (see https://www.youtube.com/watch?v=qbCniw-BcW0 for a delightful and informative talk including this topic).
As a result, in one of the modes of PDF you can save the entire font file for every font used by the PDF into the PDF itself, just in case it's not present on the recipient's machine. Costly? Sure! But what else are you going to do if your document uses a super-special font for displaying mathematical symbols or sanskrit or the glyphs of a language understood by fifty people on the planet and Unicode isn't widely adopted yet, having been invented just two years before PDF?
So in this case, the author grabbed a copy of a PDF version of the ad (because those ads are still available online), cracked open the document itself, and found the glyphs for the letters are sourced from a version of the font that was intentionally created to steal someone else's font work because the whole font file is in the document.
Actually the PDF does not contain "the entire font file for every font used" but only a subset, containing the definitions of only the characters (glyphs) being used in the document.
That's why the font appears not as "XBANDRough" but as "<6-random-uppercase-letters>+<original-font-name>"; this indicates it's only a subset of the original font.
>Sure! But what else are you going to do if your document uses a super-special font for displaying mathematical symbols or sanskrit or the glyphs of a language understood by fifty people on the planet and Unicode isn't widely adopted yet, having been invented just two years before PDF?
Assuming it's for print/display and not future editing, I imagine you could convert the font strokes to vectors or similar.
You could. But in a thousand page document, that's a lot of memory used up to record a vector for every letter 'c'. So of course you do two layers: record modifiers and transforms on a canonical 'c', and then keep a canonical 'c' somewhere with all the other letters.
... But you already have that data structure: it's the font file itself.
(Possibly worth noting here also is that historically, Adobe owned both the PDF format and the file format for most popular fonts. So they were heavily incentivized to just reuse code they already owned here instead of reinventing a wheel).
Correct me if I'm wrong, but if the DOJ requires Chrome to be sold off ... they would also not allow the new owner of Chrome to get revenue from search engines to be the default search engine.
In which case, what is the monetization model for the new owner of Chrome - other than just buying a daily portal where users go?
The DOJ doesn't require anything. They are the ones arguing for Chrome to be sold off. The federal court is the one that would require a particular remedy outcome to the anti-trust conviction.
> they would also not allow the new owner of Chrome to get revenue from search engines to be the default search engine
There would be no such mandate. Google will be allowed to pay the new company to be their default search provider. And other search providers can bid on that opportunity as well.
Google itself just cannot own the business end-to-end as it does now.
But relatedly, isn't the DOJ targeting Mozilla for exactly what I described above ... that because Google can pay so much for being the default search engine, it's not creating an environment for fair competition?
One important point that’s not obvious from this article: U.S. pollution from fossil fuels isn’t actually decreasing.
From what I understand (and please correct me if I’m wrong), overall energy demand in the U.S. continues to grow year over year. Most of the additional energy needed is now being supplied by renewables.
So while we’re adding less new pollution—because the new energy is cleaner—we’re still producing the same amount of fossil fuel pollution as before.
The baseline pollution hasn’t gone down; we’ve just slowed the rate at which it increases.
In the electricity grid specifically, emissions peaked around 2006 and are now 15% below 1990 levels, due to switching from coal to gas and introducing renewables.
What you say is broadly true globally, but Western nations are mostly on the downslope, and the globe as a whole is slowing and hopefully going negative soon.
If you’re convinced by the materials shared, you may want to consider editing your original content. It’s currently the most-upvoted comment, and is materially incorrect.
I strongly suspect that Trump fully intends for tariffs to cripple renewables (they will) and boost oil and gas jobs. The dude is hell-bent on returning America to the 1950s.
Indeed, politicians in the employ of the (traditional) energy sector have gone from mocking renewable projects and decrying them for "requiring subsidies" to demanding their curtailment because they do not.
Alberta's government (whose premier is a former oil & gas industry lobbyist) enacted a "moratorium" on new renewables projects a couple of years ago, even though the province had the most active investment in that sector in the country (most sunny days, and very windy). After the moratorium was lifted, draconian regulations were placed on potential new sites.
This was done under the cover of "protecting farmland", but this is in a province with a massive abandoned oil well contamination issue, which the private sector got away with and continues to get away with.
And then today/yesterday it came out that the government had hidden the results of public "consultations" on these matters because they were not favourable to it.
The problem with oil & gas is that it is prone to the development of parasitical, highwayman-type relationships: all sorts of people get rich quick by inserting themselves into the flow, and they will not give up this position without a dirty, dirty fight.
One of the ironies in the US is that some of the more conservative states with supposedly renewable-energy-hostile governments are actually deploying more solar than many liberal states. When people want to oppose a solar install for whatever reason, they often turn to environmental laws, requiring impact studies or other such red tape, which are much weaker in conservative states. Texas is a champion of renewable installs despite a government that is openly pro-fossil fuel.
The US had at least one year of solar PV deployment capacity in reserve before tariffs went into effect, ~50GW. It also has roughly the same amount of domestic manufacturing, but not fully vertically integrated. Gas plants are backordered well into the 2030s and coal plants (which only produced ~15% of electricity in the US in 2024) are teetering near retirement. To push either to failure would not be hard, you'd just have to target (legally, economically) specific parts of the supply chains needed for thermal generation construction (Siemens or GE Vernova) and operations/maintenance.
Tariffs and economic uncertainty are pushing down oil prices. The US O&G industry experiences pain below $70/barrel. It is not having a great time under this admin, as of this comment.
According to the EIA[1] and Wiki[2], fossil fuel electricity production has been rather stable or going slightly down over the last three years. However, electricity is only a small part of US energy consumption. When you take the rest into account[3], renewables fall from 50% of the energy mix to less than 20% (10% if you don't include nuclear), but overall fossil fuel usage is stable. The trend is downward if you take a longer time period.
> overall energy demand in the U.S. continues to grow year over year.
That's probably not unique to the US, but I wonder what that energy is used for. Our appliances use less and less electricity, our homes are better insulated, cars are more fuel-efficient, so what is using the additional energy?
It does not attempt to explain it, but this lists how much energy the major sectors use:
https://www.eia.gov/energyexplained/use-of-energy/
This data presentation shows it seems to have broadly plateaued in the last few years, with some fluctuations that do not indicate an upward trend as of 2023.
Electrification (converting things that ran on fossil fuels to electricity) is a net decrease in energy demand but massive increase in electricity demand if you're looking at electricity specifically. Past few years have also had a ton of growth in data centers and manufacturing investments. According to the EIA, they're projecting a decrease in energy demand over the next few years: https://www.eia.gov/todayinenergy/detail.php?id=65004
Commercial and industrial use, as well as increased AC usage. I'm not sure what percentage of the electric market is now for charging cars, but it's not zero.
You see similar use patterns for water. Per capita water use has gone down in many urban areas, especially places like California over the past few decades. You see hard restrictions on watering lawns, showers, toilet sizes, etc. But agriculture just sucks up the rest.
This is true when comparing the same class of car to previous models, but is it still true overall when we consider the shift to larger vehicles? The 1994 Toyota Camry is still more efficient than 2024's top-selling Ford F-150.
Jevons paradox is worth considering when reading these stories of increased renewable energy generation and falling prices.
Yes! Automobile fuel efficiency is now probably the classic example of the Jevons paradox: some cars are more fuel-efficient, and (therefore?) we in the United States drove roughly 38% more miles in 2024 (3.3e12) than in 1994 (2.4e12).
I don’t know but there has been a lot of manufacturing construction after the pandemic. Manufacturing construction has doubled since 2021 to the highest levels since 2005. Presumably these projects consume power for both construction and operation. There’s probably also some amount of new industry becoming viable as energy prices fall.
Is that a fact or a supposition based on fossil fuel generation continuing to grow? Pollution produced by fossil fuels isn’t necessarily equal. Modern power plants are significantly less polluting, and gas is much better than coal. If coal continued its decline and newer plants replaced some old ones, you could easily have less total pollution from increased fossil fuels generation.
I realize what I'm about to say will get backlash ... but I can't help but think that the time taken to write this postmortem is indicative of how the business was run.
Meaning, who benefits from the output of this postmortem? Seems like mostly strangers (who might not even live in the UK).
What other time/effort/resources were spent on things that weren't directly engaging with their customers ... because it seems extremely clear, without knowing much about that market, that this isn't a technical challenge per se but a regulatory/social problem, and the modest amount of capital they raised won't even scratch the surface of solving it.
Note: not intending to be negative. It just seems like the elephant in the room is that the team was so ill-prepared, and so far from understanding what actual problem they were solving, that my heart goes out to them.
> Meaning, who benefits from the output of this postmortem? Seems like mostly strangers (who might not even live in the UK).
Seems to be a piece of content marketing intended to help the two founders land a new role in the US so, in that sense, it does seem pretty strategic and well targeted.
> this isn't a technical challenge per se - but a regulatory / social problem and the modest amount of capital they raised won't even scratch the surface of solving this problem.
Stripe hasn't fully fixed online payments but still made a good business of making things better.
At a high level, SaaS to help people filling out planning permission forms sounds like a viable business. Many thousands of people do this as their full time job, so their employers might be willing to pay £100 per user per month on something that makes them more productive.
They develop models around a very well-defined set of use cases, and they are very good at it. Look through their documentation and their API. It's very opinionated and quite a delight, honestly.
They just want the next wave of Ghibli meme clicks to go to them, really.
This will be built on the existing thread+share infra ChatGPT already has, and just allow profiles to cross-post into conversations, with UI and features more geared toward remixing each other's images.
I actually would love this. I hate having to go to another website to share some thoughts I had using tools in a platform.
I miss the days when products would actually choose to integrate other platforms into their experiences; yes, I was sort of a fan of the FB/Google share button and the Twitter side feed (not the tracking bits, though).
I wasn't a fan of LLMs and the whole chat experience a few years ago. I'm a very mild convert now with the latest models and I'm getting some nominal benefit, so I would love to have some kind of shared chat session to brainstorm in, e.g. on a platform better than Figma.
The one integration of AI that I think is actually neat is Teams + AI note taking. It's still hit or miss a lot of the time, but it at least saves and notes something important 30% of the time.
Collaboration enhancements would be a wonderful outcome in place of AGI.
The answer seems more obvious to me. They don't even care if it's competitive or scales all that well. xAI has a crazy data advantage firehosing Twitter, Llama has FB/IG, and ChatGPT just has, well, the internet.
I'd hope they have some clever scheme to acquire users, but ultimately they want the data.