Ted Hoff (who designed the 8008, amongst other chips) worked at Atari Corporate Research in the early/mid 1980s, before Atari went under.
He had a small stash of 8008 processors. Every now and then he'd get a letter from someone absolutely desperate to find a replacement for a failed one, usually from some spooky defense-related organization. Whereupon Ted would pluck an 8008 out of the protective foam, carefully package it and mail it off.
(These days you'd probably do an in-circuit emulation or some FPGA magic).
I don’t think you can really say that ‘Hoff designed the 8008’. The architecture / ISA came from CTC and Federico Faggin actually designed the chip.
There may be an Intel document somewhere that says Hoff designed it but it’s probably the same situation as the 4004 where they wrote Faggin out of history after he left to found Zilog.
Slight correction - I misremembered: it was Hal Feeney who did the logic design (his initials appear on the chip), but Federico Faggin led the implementation team.
I'm going with Jack Kilby and Robert Noyce as giants whose shoulders we stand on. A particular generation of chip is just one stepping stone of the many along the way.
I think he's saying that datacenters would not have water sprinklers and would instead use Halon fire extinguishers.
He's wrong and you/Wikipedia are right, BTW. Halons were banned in 1994, before datacenters became commonplace. Argon and other gaseous fire suppression systems (of which Halon was one, but a banned one) are used instead.
The op's a bit outdated, but not completely wrong. Even now a lot of data centers use various halon replacements rather than Argonite, though there are a number of different gaseous fire suppressants used in data centers these days.
The last data centre I worked in certainly had water sprinklers too.
The gaseous system was one-shot. If the fire continued or restarted, or another fire broke out before the system was recharged, the water sprinklers would save the building.
Redundant systems. Of course, some data centres probably have redundant gaseous suppression too.
If I recall correctly, Halons (bromomethanes) are ozone-depleting, though they may also be greenhouse gases, I'm not sure.
The last data centre I worked in had an FM-200 liquefied-gas fire suppression system; that's 1,1,1,2,3,3,3-heptafluoropropane, so probably still a greenhouse gas to some extent, but that wouldn't be a significant concern given they're not routinely discharged.
The real change was Intel shifting from manufacturing commodity RAM to creating microprocessors. The 4004 was a major factor. But it was the corporate strategy change, leading to the 8008, 8088, etc., which really drove the success of Intel and the microcomputer.
The 4004 is very interesting in just how mind-bogglingly odd it is. It's well worth reading the (very short) data sheet[1], or at least skimming the available instructions.
I wrote a tiny 4004 program for one of the emulators out of curiosity, and it was a wild experience.
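To give a flavour of how narrow the machine is, here's a rough Python sketch (not actual 4004 code or a real emulator - the class and method names are made up for illustration) of the parts that surprised me most: a 4-bit accumulator, a single carry bit, and sixteen 4-bit index registers, with ADD folding the carry into the sum:

    # Toy sketch of 4004-style 4-bit arithmetic, for illustration only.
    class Tiny4004:
        def __init__(self):
            self.acc = 0          # 4-bit accumulator
            self.carry = 0        # 1-bit carry/link flag
            self.regs = [0] * 16  # sixteen 4-bit index registers

        def ldm(self, value):
            """LDM-style: load a 4-bit immediate into the accumulator."""
            self.acc = value & 0xF

        def add(self, r):
            """ADD-style: acc += register r + carry; carry gets the overflow bit."""
            total = self.acc + self.regs[r] + self.carry
            self.acc = total & 0xF
            self.carry = 1 if total > 0xF else 0

    cpu = Tiny4004()
    cpu.regs[0] = 9
    cpu.ldm(9)
    cpu.add(0)
    print(cpu.acc, cpu.carry)  # 9 + 9 = 18, which comes out as 2 with the carry set

Anything wider than a single decimal digit means chaining these 4-bit adds by hand, which is where the "wild experience" comes from.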
>Google is about to introduce a next-generation Tensor chip, updating the ones found in the company’s Pixel 6 phones. It is basically an artificial-intelligence accelerator in your pocket, allowing your phone to adapt to you—your own personal neural network to train.
Is this accurate? Everything I've read online only discusses the tensor chip for inference (running the neural network to make predictions), not training a new model for that specific device. For example this blog only discusses how they are making inference faster: https://ai.googleblog.com/2021/11/improved-on-device-ml-on-p....
I work adjacent to some teams using it. It's not idle research speculation; there are consumer product teams actively commercializing it.
(IMHO - speaking only for myself and not my employer - Federated Learning will be the most important invention to come out of Google, and its primary beneficiary will be someone other than Google, in the same way that the Alto was the most important invention to come out of Xerox but made a lot of people other than Xerox rich. It lets you build machine-learned models off of billions of devices without having centralized infrastructure or data storage, which finally makes the convenience and functionality of modern consumer software compatible with decentralized architectures and user privacy.)
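For anyone who hasn't seen the mechanics, here's a toy Python sketch of federated averaging, the core idea behind federated learning - this is an illustration, not Google's actual implementation, and the function names and data are made up. Each device computes an update on its own private data, and only model parameters ever reach the server, which simply averages them:

    import numpy as np

    def local_update(weights, local_data, lr=0.1):
        """One gradient step on a device's private data (linear model, squared loss)."""
        X, y = local_data
        grad = 2 * X.T @ (X @ weights - y) / len(y)
        return weights - lr * grad

    def federated_round(global_weights, devices):
        """Server averages the locally updated weights; raw data never leaves a device."""
        updates = [local_update(global_weights.copy(), data) for data in devices]
        return np.mean(updates, axis=0)

    # Hypothetical example: three devices, each holding its own (X, y) data.
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    devices = []
    for _ in range(3):
        X = rng.normal(size=(20, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=20)
        devices.append((X, y))

    w = np.zeros(2)
    for _ in range(100):
        w = federated_round(w, devices)
    print(w)  # converges toward [2.0, -1.0] without any device sharing its data

The point is that the server only ever sees weight vectors, never the raw (X, y) pairs; the published production systems layer secure aggregation and differential privacy on top of this basic scheme.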
Federated learning is neat in some ways but I’m wondering how this solves anyone’s privacy concerns? As we saw with the FLoC controversy, people aren’t going to be any happier about collecting data via an opaque algorithm that they can’t audit.
I think it does avoid collecting data. The problem is that it doesn’t help for gaining trust, so only an already-trustworthy organization could use it.
The algorithm itself is public - the blog post has arXiv papers on it.
People have shown themselves very willing to trust algorithms that are opaque (as in difficult to understand) but auditable (by other people). Witness Bitcoin or Ethereum, where they will put their life savings into an algorithm. They're just not willing to trust organizations or people, where the organization can change the rules of the game at whim. (Even this has exceptions - ETH flourished over ETC despite doing a hard fork to rewrite history after the DAO hack.)
The software and hardware supply chains need to be trusted. This would require more transparency than we get from Apple or Google. It’s doable, but who is going to do it?
I don’t think people trusting cryptocurrency is much of an argument because cryptocurrency investors in general are a minority, and also, people do a lot of stupid things with cryptocurrency.
Potentially, but that's not what killed Xerox (who had a decade head start on Microsoft and Apple and had full visibility on what they were doing in the marketplace for another decade before GUIs became commonplace).
They fell victim to the Innovator's Dilemma: when a new technology comes out that fundamentally rewrites the rules about who your customers are and how markets should be organized, existing market leaders cannot maintain their lead. That's because their market itself gets restructured - where previously one market with a dominant firm and set of existing customers used to exist, now oftentimes new markets with new dominant firms and a new set of customers exist. Basically it rewrites all the assumptions that the business is structured around.
I'm pretty sure TechTechPotato (whom I consider a reliable source) said it would be used for training as well as inference, but didn't mention use for specific devices.
Hmmm. That's only 11 months after the first microprocessor -- Ray and Bill Holt's Central Air Data Computer for the F-14 -- flew for the first time. (Dec 1970)
I’d argue that 1. it wasn’t a microprocessor, and 2. there’s no point counting “secret hardware”, the same way we don’t count never-released prototypes.
Sad to see another article that gets the roles of those involved in the creation of the 4004 confused. Wikipedia gets it almost right:
> The chip design, implemented with the MOS silicon gate technology, started in April 1970, and was created by Federico Faggin who led the project from beginning to completion in 1971. Marcian Hoff formulated and led the architectural proposal in 1969, and Masatoshi Shima contributed to the architecture and later to the logic design.
Need to add that Stan Mazor worked with both Hoff and Faggin on the design.
The article explains its reasoning - most of the wealth created since 1971 has been in the finance, tech, and globalization sectors. Other sectors have been shrinking, but people in finance, tech, or globalization have been doing great. (I assume he's measuring "wealth" as in "net change in revenue", where this statement is roughly correct.) Those three sectors are essentially dependent upon the microprocessor - tech obviously so, but finance is basically the province of computer models these days, and globalization is enabled by the communication, data storage, and algorithmic improvements made possible by software.
The article title is not very accurate. The 4004 was an evolutionary dead-end. The later, unrelated, 8008 design was a much more important chip. But the 4004 was the first successful COTS microprocessor, which is why it is remembered.
Haven't read the article but my immediate thought is that it's kind of obvious if you simply replace it with "microprocessors". I don't know about the historical significance of the 4004 though.
You can help yourself by reading it as inspiration drawn from technologies that changed the world, regardless of what the current situation is. The computer chip and, more fundamentally, the invention of the transistor are to be reflected on not through the lens of Competitor A vs. Competitor B, but through appreciation of the science, the engineering, and all the people that made it happen.