
This is like watching a carpenter blame their hammer because they didn't measure twice. AI is a tool, like a power tool for a tradesperson: it'll amplify your skills, but if you let it steer the whole project? You'll end up with a pile of bent nails.

LLMs are jittery apprentices. They'll hallucinate measurements, over-sand perfectly good code, or spin you in circles for hours. I've been there, back in the GPT-4 days especially; nothing stings like realising you wasted a day debugging the AI's creative solution to a problem you could've solved in 20 minutes.

When you treat AI like a toolbelt, not a replacement for your own brain? Magic. It's killer at grunt work like explaining regex, scaffolding boilerplate, or untangling JWT auth spaghetti. You still gotta hold the blueprint. AI ain't some magic wand: it's a nail gun. Point it wrong, and you'll spend four days prying out mistakes.

Sucks it cost you time, but hey, now you know to never let the tool work you. It's hopefully a lesson OP learns once and doesn't let it sour their experience with AI, because when utilised properly, you can really get things done, even if it's just the tedious/boring stuff or things you'd spend time Google bashing, reading docs or finding on StackOverflow.


So apparently, Sam Altman is so skilled he can cast magic spells on people? This all just adds credence to the idea that he was fired in a coup.


Or he's great at manipulating people, so it's hard to point at a single example that proves it.


Depends if he was entirely candid about which spells he was casting on them I suppose.


most people don’t save their receipts. it is a hassle until you need it.

most interactions aren’t between “the good side” and “the bad side.”

there seems to be a hesitance to entertain the possibility that a bad board pulled a coup on a bad ceo. harder to pick sides.

more likely than a good board pulling a coup on a good ceo, but that’s also possible. everyone got too afraid and now they hug it out.

so, no, not magic. just mass amounts of detail outside the public eye. makes for a lot of possibilities that people will speculate about online. even argue over it.


The fact the new CEO can't even get answers from the board is quite telling. Looks like the OpenAI board wants those investor lawsuits. And allegedly the Quora guy Adam D'Angelo is the ringleader of all this?


This whole saga has helped me see how rumors grow. (And I know you used the word allegedly, but still.) First it was Ilya who was the ringleader. Now it is Adam. There has been a small amount of new information (Ilya seems to have backtracked), but there is no real information to suggest Adam was a ringleader. It is the pure speculation of people trying to make sense of the whole thing.


There is no evidence that Adam is the ringleader.

All four are possible ringleaders.

Given Ilya's change of heart he is slightly less probable as the ringleader.


I have no evidence, but I do have faith that anyone who turned Quora into what it is today could totally be the ringleader of this clusterflack


Adam runs a clone of ChatGPT (the Poe platform). It's right there on his Twitter account. Isn't this a conflict of interest and a motive?


First the board allowed the for-profit corp to form, and is now firing the guy that did it. Second, they allow a board member to build a competing startup. What kind of AI safety/alignment/save-the-world crap is that?


yes, yes it is.


The detective from Knives Out would have solved this by now


> Given Ilya's change of heart he is slightly less probable as the ringleader.

I tend to believe that was exactly his strategy with his change of heart...


I'm not sure if lawsuits against the non-profit will be possible, as the investors didn't invest in it. More likely, making public the facts behind who was responsible for the shenanigans and what evidence they had (if any), combined with pressure from employees, will force their hand.


Or Dustin Moskovitz; it seems many of the board members may be linked to him.


No https://www.threads.net/@moskov/post/Cz482XgJBN0?hl=en

"A few folks sent me a Hacker News comment speculating I was involved in the OpenAI conflict. Totally false. I was as surprised as anyone Friday, and still don’t know what happened."


HTMX is cool, but it's honestly only in the conversation because the developer has leveraged memes and garnered popularity on Twitter.


yeah, i'm not google or facebook or ny times or whatever, i'm a lone dev in montana

what other options did i have?


People don't realize that they use many of their dev tools more because of those tools' large marketing reach than because of any objective decision to use them.

Entire companies have been built around the premise of good developer marketing, and if you're faang, you simply get to impose whatever developer trends you want to see in the market.

I'll never forget how wildly popular Stripe became overnight because of how easy their SDKs were - people were happy to give them a higher % of each transaction (all the Stripe competitors at the time were cheaper -- this isn't true anymore, but was at the time). It always blew my mind that people were so willing to give up a % of each transaction to save an extra couple days of development.

Developers in general are notoriously susceptible to marketing trends and if you're building a dev tool that you want to gain traction you absolutely have to play that game.


Yep. Vercel has raised $300m.


I am not saying HTMX is terrible or anything. What you've built is cool and I've built something with it. The point I was making wasn't made very well. What I meant was that good options are often left out of the conversation because of React, Vue and, for a while there, Svelte. There are a lot of great libraries and frameworks that nobody talks about, HTMX included. I just feel like HTMX isn't being hyped because it's good, but because of the memes/marketing aspect. I think it's a disservice to your work, which deserves to be assessed on its merits. It's a sad indictment of front-end that building something good is no longer good enough to get recognition.


yeah, it is a little unfair

i've tried to produce a lot of technical content, arguing for htmx on its merits:

https://htmx.org/essays

https://hypermedia.systems

but the reality is that marketing is what gets people to that content. I tried for years to convince people on pure technical merit alone, and only made halting progress.

i also got very lucky that a few things all came together at once:

* the primeagen & fireship_dev both covered htmx

* we released our book

* the twitter algorithm changed to boost funny stuff/memes


I'm old enough to remember when servers rendered everything and you used CSS and Javascript to enhance the pages after they were rendered. The web is in such a dark and overengineered place. It's almost unbelievable. It's why my approach to building apps is server-rendered first and then enhanced after the fact.
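Something like this is all "enhanced after the fact" usually takes (just a rough sketch with a made-up selector and endpoint, not anything from a real codebase):

    // rough sketch of "server-rendered first, enhance after":
    // the form works as a normal full-page POST without this script;
    // when JS is available, intercept the submit and swap in the
    // server-rendered fragment instead of reloading the whole page
    const form = document.querySelector('form.comment-form'); // made-up selector
    form.addEventListener('submit', async (event) => {
      event.preventDefault();
      const response = await fetch(form.action, {
        method: 'POST',
        body: new FormData(form),
      });
      form.outerHTML = await response.text(); // the server still owns the markup
    });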


I agree for the most part. Collapsible dropdowns and jQuery drag-and-drop are nice-to-haves, but I also clearly remember the hellscape of state management in js/jQuery and I'm not too keen on going back to that.


To this day I don't understand people saying this. Can you describe a project or instance where it was tough? I never had any issues even in complicated projects with thousands of lines of JS.


Developers didn't understand event delegation, so entire codebases were filled with $.live(), .on() and .off() calls on individual elements. If you componentized behaviors with jQuery it wasn't all that bad, although I'll admit it could get somewhat gnarly.

The problem, from what I saw, was nested forms and trying to hold state on the client when the server should have been asked for the state. This led to template duplication and trying to hold too much state on the client.
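For anyone who never lived through it, the difference looked roughly like this (made-up selectors, not from any particular codebase):

    // direct binding: a handler per element, and rows injected later get nothing
    $('#orders .delete').on('click', function () {
      $(this).closest('tr').remove();
    });

    // delegated binding: one handler on a stable ancestor covers
    // current and future rows alike
    $('#orders').on('click', '.delete', function () {
      $(this).closest('tr').remove();
    });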


Early JS state management was horrific, yeah, but modern libraries (while a little hard to wrap your mind around initially) are really powerful.


You guys are using Javascript?

I jest; our first geocities page was just HTML with maybe a visitor counter, marquee and whatnot.

Our first PHP application / project in school actually didn't use JS yet, it used frames for a static menu and header and just straight form submission to get data to the back-end. Those were the days.

But when I did my first college-level internship (1 year of internships is part of college education over here), it was Java back-end, JSX templates for the presentation part, and it was enriched with PrototypeJS for things like dialogs and an animated accordion (back when animation was still "update the height of this element a couple times a second").

My first job involved using a lot of JS to still enhance a page; add to cart, image carousels, that kinda thing. jQuery era.

And my next job involved building a user interface for customer support staff to look into SAP or something like that, poorly built in BackboneJS.

The next assignment was once again using BackboneJS to rebuild the investment banking front-end for customers. That was - as with most applications I've built btw - a great use case for single-page applications (as they were called then). No SEO needed, fast enough to render purely front-end, API heavy (that was also the time when people realized you could build one API for both web and mobile), etc.


You guys used PHP? I started out with include directives with Apache, *.shtml files, and cgi-bin. SEO didn't even exist yet. You were either in the Yahoo! registry or you weren't. You just put your website URL at the bottom of the printed glossy brochures at the trade shows. "Mobile" wasn't a thing. Almost no one had any kind of cell phone. Pagers were still common. PalmPilots were brand spankin' new, a few hundred dollars (not including the 14.4kbps modem), and did not have web browsers.

(I'm not gonna claim things were better back then.)


As a different perspective, I'm also old enough to remember when CSS and Javascript were invented, and I've been making websites since that time.

IMO web dev has never seen better tooling than today, and user experience has improved tremendously over the years.

What we used to call AJAX has grown from a neat side toy to a basic part of everyday life in the form of client components and SPAs. The server is still as powerful as ever if you want it to be. But having such awesome client-side power is nice for interactive apps like dashboards, maps, games, forums, office apps, online IDEs, etc. It enabled the wholesale migration of everyday apps from bespoke desktop apps for each OS to a universal platform across all laptops and desktops.

All that power of course required more complexity. It's very different trying to write a blog or landing page in HTML/CSS vs trying to write a whole web app. Angular and React were invented to help develop apps that were several times more complex than their predecessors, at a time when JS runtimes (and the language itself) were still really primitive compared to the server-side languages of the time.

Yeah, there was a really painful period there in the late 2010s where different JS frameworks each solved a tiny part of the problem. These days it's less of an issue. Next won, became the default, and deservedly so. It's really good, and has an appropriate level of abstraction for mid complexity apps, and allows a good mix of server-side rendering and client-side pages. Adding React Server Components makes that division a cleaner first class citizen.

That only makes sense at a certain complexity though. If you don't need it, don't use it. If you're making a largely static blog or documentation site, there are simpler architectures. You can still use HTML and sprinkle in a few lines of JS as needed. You can still use WordPress or Wix for most small business needs.

But if you're building more complex apps, god, React is an absolute dream compared to trying to round-trip every minor interaction to the server for recomputing the UI and sending over an entire HTML page every time, losing context and page position and half-filled forms and whatever. That approach also encouraged the use of form data as state, and work was often lost on an accidental back button or the frequent server crashes before the advent of trivial cloud scaling.

IMO it's only overengineered when misapplied. Some of these tools are really useful, even essential, in the proper use cases. Maybe the sad part is that we overteach them and encourage their use even when they're not necessary (or perhaps counterproductive). Right tool for the job and all that.

Edit: not really pitching React over Vue or Svelte or HTMX or whatever, just that client-side complexity has its uses. Pros and cons, not strictly better or worse.


Well, React certainly has succeeded in increasing the client-side complexity, that's for sure. They seem to be making good inroads on increasing server-side complexity as well. Kudos to their contributions to excess CO2 emissions!

I have yet to see a React project with more than five contributors fail to turn into a big ball of mud within 18-24 months, requiring either a periodic rewrite or resigned acceptance of trudging through large volumes of mud to get anything done.


> I have yet to see a React project with more than five contributors fail to turn into a big ball of mud within 18-24 months, requiring either a periodic rewrite or resigned acceptance of trudging through large volumes of mud to get anything done.

Not my experience at all, and I've been working with React for years. I've seen companies successfully transition to functional components from class components, all while maintaining years-long functionality. Just because you dislike React doesn't mean it's not successful.


I never said it wasn't successful; I said it had successfully increased complexity.

I am glad you are an experienced React developer. React needs more of you, because the vast majority aren't.

That said, I have little doubt that you and your team could deliver just as well with most other frameworks: frameworks with a shallower learning curve, just as much power, greater performance before reaching for useMemo(…) equivalents, and far, far less boilerplate code.

But 100% correct: I dislike React.


True, it does take more experience, but there are footguns in any language. Personally, having used Knockout, I see that signals are not the future, because I guarantee in 5 years, just as with Knockout and RxJS, there will be articles out on how signals create a spaghetti mess of code. So I'd rather take React verbosity if only because I know where the alternatives lead.

Now if there were a performant version of the core React philosophy, like Preact (or React with its upcoming Forget compiler), I'd gladly take it, but it seems that frontend libraries these days are trending in the wrong direction.


Well, I've yet to see any web project of that scale survive more than 2 years. What do you think would be a better stack with similar frontend interactivity?

(Genuine question, not being snarky)


I have Aurelia 1 apps that have been in production since 2015. They rarely need to be touched, but when they do, they're really easy to modify. I am currently using Aurelia 2 and will have similarly scalable apps in production for years to come too.


Hadn't heard of this one, and checked it out just now. Seems interesting. Thanks for sharing!


I have seen Angular projects last pretty long avoiding code entropy. Angular is opinionated, rigid, and verbose, but goddamn if that structure and predictability don't pay off in the long term.


I'd love to investigate that, honestly. I loved Angular v1 (Angular.js now, I think?) and even that was way cleaner and more opinionated than React.

I haven't tried modern Angular in a while. But Next.js reminds me of a lot of the things I loved about early Angular and disliked about raw React (which always was more of a UI lib than a proper framework). Have you tried comparing them in particular?


My most recent use of Angular on a project is using v13 at the moment. I've been doing more infrastructure work lately, and Angular's build times—and bundle sizes—have been a persistent sore spot for me. I keep eyeing v16 jealously since it uses signals, builds A LOT faster, and removes some more boilerplate code in components, and the output assets are pleasantly smaller.

I haven't tried Next.js myself, though a coworker was playing with it recently. The amount of components available for React, especially the for-pay libraries, can really decrease time to release. But it's still React. Builds a little slower than Angular 16 at this point, but it's fairly close. You still need to keep a firm hand on the codebase when working on a team to prevent code enmuddification, though less so than plain React. React's (what I consider) extreme amounts of boilerplate for doing anything of consequence are still there though. I also personally prefer Angular's service injection patterns over React's explicit imports everywhere.

I'd give them both up in a heartbeat for SvelteKit if I could make a business case for either rewrites or new projects. As it stands, Angular is really good enough and fights code entropy well enough that there isn't a business case there yet.


Cool. Thanks for the endorsement! Maybe it's time to build a hello world in modern Angular and SvelteKit and compare both to Next.


Would love to hear about your experience!


Why would server-side rendering send the entire HTML page every time? Or lose context or page position? All of those have been smoothly handled by pure server-side frameworks such as Wicket or Tapestry for over a decade.


What do you mean? Maybe I'm misunderstanding something, but Wicket itself says it uses Ajax components for client updates.

As far as I know, it's not possible to do client-side rerendering without Javascript.

Many frameworks these days do a hybrid approach of server and client-side rendering, sometimes with rehydration, sometimes not. But my understanding is that they all require some level of Javascript to redraw UIs on the client.

If I'm misunderstanding (or just plain wrong?), could you please elaborate?


That's exactly what this approach (server components) is for.


One of these posts. Dig into the numbers and claims, and you'll see that they're not building something anywhere near Twitter scale.


A lot of the React fanatics are in denial about this. But I've seen some absolute messes. Does anyone remember InVision? You've probably forgotten about them because Figma and other design tools ate their lunch, but part of their failure and the delay in releasing InVision Studio was because of React. I know other companies have encountered limitations with React and had to hack around them, most notably Atlassian, who had to break their app up into smaller apps because they wanted to use state management for everything and the memory problems were astronomical.

React got popular because, at the time, Angular.js was the framework of choice, and we can all agree Angular had some serious problems, like the digest cycle and performance issues working with large collections. React in comparison to Angular was a breath of fresh air, but in 2023 there are far better choices than React.

The thing is, React started out as just a view library, and the community is mostly to blame for how terrible it has become. How many state management libraries have there been? How many router packages? Devs think they want a library; they actually want a framework. It's why the most popular uses of React are not vanilla React, they're frameworks like Next.js. Also, when you compare the performance of React to other libraries and frameworks, you realise it's one of the worse-performing options. It's slow. The whole notion of Virtual DOM was groundbreaking in 2013, but in 2023, reactive binding approaches like those taken in Aurelia and Svelte are far superior and better performing. Virtual DOM is overhead.

Has anyone ever seen a large-scale React project that is just using vanilla React? I've seen a few large-scale React codebases and many of them were a tangled spaghetti mess. It gets to a point where React's lack of opinions and standards for even the most basic of things means you can see the same thing implemented 100 times in the React community. If you're building something beyond a landing page or basic CRUD app, you need some conventions. There is a reason that, despite the hate it gets, Angular is still used in enterprise and government settings. It's verbose, but you write components and other facets of your apps a specific way, and you know that a feature implemented by Developer A is going to be understood by Developer B because the code isn't going to be so self-opinionated. This is something that backend frameworks learned years ago.

Don't get me started on the terrible communication from the React team. A good example is how they have handled React Server Components. They changed the docs to recommend RSCs by default, despite the fact that many community packages don't work with them or require additional packages. The way they approached RSCs and rushed them out was terrible. This isn't the first time either.

I have been working with Aurelia 2 (https://docs.aurelia.io) and I love it. It's intuitive, it's fast, the developer experience is great, and it comes with all of the dependencies you need, like a router, validation, localisation and whatnot. No need to go building a faux-framework of Node packages bloating your app. I've also been working with Web Components, which are in a really good place now too and getting better with each proposal (there are a few good WC proposals in the works right now).

Developers need to start using the platform more. Web Components have been supported by all browsers since 2020 (Chrome has supported them since 2013), so it's a good time to dive in. Using something like Lit gives you a nice development experience. Part of the problem is React developers have been brainwashed into thinking classes are bad because of some terrible design decisions in React (which led to Hooks) and writing Web Components relies on using class syntax (although, you can write them as functions if you really want to).
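For anyone who hasn't looked recently, a bare-bones custom element really is just a class and one registration call (the tag name and attribute here are made up for illustration):

    // a bare-bones custom element: class syntax, standard APIs only
    class CopyButton extends HTMLElement {
      connectedCallback() {
        this.innerHTML = '<button>Copy</button>';
        this.querySelector('button').addEventListener('click', () => {
          // copies the element's "text" attribute to the clipboard
          navigator.clipboard.writeText(this.getAttribute('text') ?? '');
        });
      }
    }
    customElements.define('copy-button', CopyButton);

    // usage: <copy-button text="npm install aurelia"></copy-button>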

React is anti-standards, it's anti-developer and it's bad for the platform. The fact that React is ten years old and still doesn't support Web Components goes to show just how little the React team cares about web standards. I know jQuery is a bit of a meme now, but at least jQuery helped shape modern standards. Has anything good come from React for the platform?


I have worked with Aurelia 1, and I strongly recommend against using this framework. Over time, I have collected many gripes, but just off the top of my head:

- Arrays are observed by monkey-patching the push, pop, and other methods. There is no concept of assigning keys to cells [1]. So, if you want lists to be stable, you must manually diff the arrays and mutate the old one.

- In general, the observation system was awful. They have a custom JS subset interpreter for their templates, and they secretly add properties to observed objects. If all else fails, they will start up a 300ms timer that polls for changes.

- The binding system favors mutations over functional updates, but deep-observing objects isn't possible. So, if you want to observe a data structure for changes, you may need to write down each key.

- I encountered multiple bugs in the templating language. I don't remember the exact details, but they were similar to this: If you have a containerless element somewhere inside a conditional element, that's somewhere inside a repeated element, the containerless element isn't rendered.

- No type safety in their templates.

- No conditional slots, no dynamic slots, it's not possible to forward slots, can't detect if a slot is used.

- In my tests, performance was worse than React.

[1]: Unless you dig through GitHub issues and find an external contribution. However, it was broken for some time and doesn't follow the Aurelia conventions.


These are all Aurelia 1 concerns which have been fixed in Aurelia 2.

- There is no dirty-checking in Aurelia 2. The observation system now uses proxies and other sensible fall-back strategies. The computed decorator for getters is also gone in v2, meaning no accidental vectors for dirty-checking.

- Observation system was rebuilt to use many of the same strategies detailed in point one. No dirty checking and proxy-first. Similarly, your next point about mutations has also been addressed by the new binding system.

- Many of the templating bugs people encountered were spec implementation issues due to how the browser interprets template tags and content inside them. There were a few repeater bugs, but the ones outside of non-spec compliance haven't been a problem in years and do not exist in Aurelia 2.

- You can write type-safe templates now.

- You can have conditional slots now if you use the new au-slot element. A lot of the slot limitations in Aurelia 1 were because Aurelia adhered to the Web Components spec for how slots worked. In v2 there is still slot, but a new au-slot has been introduced to allow you to do dynamic slots, slot replacement, and to detect if slots are defined or contain content.

It's important to realise Aurelia 1 was released in 2015, so it's not perfect and some design decisions reflected the state of the web and browser limitations at the time. Aurelia beat out React in a lot of benchmarks back in the day. I'm sure Aurelia 1 vs React has slipped, but Aurelia was one of the faster options for a while, especially in re-rendering performance. You should give v2 a look. It improves upon v1 in every single way.
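I can't dump Aurelia 2's internals into a comment, but as a rough, generic illustration of the proxy approach (not Aurelia's actual code), trapping property writes is what removes the need for dirty-checking or monkey-patching:

    // generic sketch of proxy-based observation, not Aurelia's implementation
    function observe(target, onChange) {
      return new Proxy(target, {
        set(obj, key, value) {
          obj[key] = value;
          onChange(key, value); // e.g. schedule a binding update
          return true;
        },
      });
    }

    const state = observe({ count: 0 }, (key) => console.log('changed:', key));
    state.count++; // logs: changed: count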


Aurelia 2 has been in alpha for years and is now only in beta.

My point wasn't only that these issues exist. If a framework has this many issues that go unfixed for years, while the user-base dwindles, maybe you shouldn't trust the developers.

In addition, I believe it's not only the implementation but the fundamental design that is flawed. There's a reason why UI development has moved away from the OOP/Mutation/MVVM approach. The problems that hooks were intended to solve are real, and every big framework since React has provided approaches to solve them... except Aurelia.


how is the transition from aurelia 1 to 2?

my app has actually very little aurelia specific code so i expect this should not be too hard. if only i can find an aurelia 2 equivalent of this version that works without any build system: https://news.ycombinator.com/item?id=36971080


The syntax and overall paradigm of Aurelia 2 is the same. The team avoided where possible a repeat of what Angular did to the community with the transition to Angular 2. Most notable differences are routing and dynamic composition.

There is a build system free version documented here. Is this what you mean? https://docs.aurelia.io/developer-guides/cheat-sheet#script-...

If you need any help porting it over, just let me know.


that looks interesting, but it doesn't seem to work.

https://unpkg.com/aurelia/dist/native-modules/index.js doesn't resolve. i tried https://unpkg.com/aurelia/dist/native-modules/index.mjs which does resolve, but it links to a dozen other files which all don't seem to resolve either. it looks like unpkg.com is rather broken.

with aurelia 1 there was a downloadable archive (which still exists) that had everything bundled that i could just unpack and it was ready to run. it didn't even need a server to host the files if i was running a browser on the same machine.

the original documentation for that is here: http://web.archive.org/web/20160903072827/http://aurelia.io/...

basically there are two things i am looking for:

i want to be able to develop the application without using browser-sync or build steps and it appears the version you linked promises that.

but i also want to be able to host and develop the application completely offline without any need for internet access.

the reason for that is that i am using the application (in production as it were) while i am developing new features or fix bugs that i discover while using it.

running the transpiler in the browser doesn't bother me, it's been fast enough so far.

i would use aurelia-cli if i could figure out how to make it build a development version without browser-sync and without transpiling and compressing the code before it is deployed.


is there any framework that works similar to aurelia but does this better?

i really like how aurelia is practically free of boilerplate code, and doesn't force my code and data into specific structures.


I've been working with Web Components a bit lately and was pleasantly surprised to see Lit had some similarities to Aurelia. Nothing really comes close to Aurelia, which is surprising given it has one of the better developer experiences.


> The fact that React is ten years old and still doesn't support Web Components goes to show just how little the React team cares about web standards.

No one cares about web components, including people who were originally really bullish on them (Vue, Svelte, Solid). React supports web components just as it supports anything that attaches itself to the DOM, and that is more than enough for the vast majority of use cases.

Meanwhile, if Web Components were any good they wouldn't need another 20 specs just to barely patch holes in their design: https://w3c.github.io/webcomponents-cg/2022.html or still have unresolved issues like "custom button cannot submit a form": https://github.com/WICG/webcomponents/issues/814

> but at least jQuery helped shape modern standards. Has anything good come from React for the platform?

It could have, if the people behind web components listened to anyone except themselves.


The real solution is Web Components.


In my experience, ChatGPT and Copilot are just cutting out the middle step of a process developers have followed since the internet made problem solving more democratic. ChatGPT and Copilot are effectively StackOverflow, except faster. Right now, you still need to be an experienced developer to spot the hallucinations, but I see AI as a tool, not a replacement.

I remember developers saying similar things about jQuery: that it was going to make developers lazy because they weren't writing manual queries and JavaScript code (especially for things like XMLHttpRequest), when really it just made developers more efficient. People said the same thing about Ruby on Rails, how it allowed developers to work with Ruby without having to really know Ruby. The same goes for database ORMs. Developers decried that they would make developers lazy and unable to write manual database queries.


A submarine made out of carbon fibre, built by a company that openly bragged about using off-the-shelf components to reduce costs, refused to hire domain experts, and fired its director of marine operations for voicing concerns over safety, cutting corners? Say it ain't so. Despite James Cameron being known for his films, people forget he's been actively involved in the development of deep-sea submersibles. He's been to the Titanic wreckage site more times than most. Paul-Henri Nargeolet, who tragically died on this experimental sub, had made 35 trips; Cameron has made 33.

It's not even an accusation at this point, it's the truth. And Cameron is more knowledgeable about this subject than most. The reality is deep-sea submersible technology is still an underdeveloped field, as evidenced by the fact that few vessels exist that can take humans to such depths, and even unmanned vehicles are few and far between.


Oceangate's 2019 blog post "Why Isn't Titan Classed" (now only available in internet caches) is a stunning display of either deeply flawed logical thinking or a willful attempt to confuse people. The post basically says: (1) the "vast majority of marine (and aviation) accidents are the result of operator error, not mechanical failure", and (2) the vehicle classification guidelines are too stringent and stymie innovation.

A rational person might interpret that as: (1) mechanical-related incidents are very infrequent as a percentage of total incidents, because (2) vehicle guidelines successfully minimize rates of mechanical failure, such that remaining incidents are generally operational in nature.

Oceangate ignores this implication and bluffs its way from pointing out that most incidents are operational in nature (for a sample set of largely mechanically certified crafts) to implying that a focus on operational safety is a reasonable way to minimize total risk (for an uncertified craft).

ref: https://webcache.googleusercontent.com/search?q=cache:Y_6Rrx...


Boeing cut costs too, and they bounced right back. That's the real lesson - you can absolutely get away with cutting corners, firing people who voice concerns, and then killing people. Just wait till the next news cycle and everyone forgets.


Boeing, a global leader for both sales and innovation in an industry that drives the modern world. Oceangate, a rebel "innovator" in a niche market for niche customers

also, the corollary: why nobody has died on a voyage to Mars yet

