The 50th anniversary of the launch of the Intel 4004 (wsj.com)
120 points by sonabinu on Nov 15, 2021 | 52 comments




Ted Hoff (who designed the 8008, amongst other chips) worked at Atari Corporate Research in the early/mid 1980s, before Atari went under.

He had a small stash of 8008 processors. Every now and then he'd get a letter from someone absolutely desperate to find a replacement for a failed one, usually from some spooky defense-related organization. Whereupon Ted would pluck an 8008 out of the protective foam, carefully package it and mail it off.

(These days you'd probably do an in-circuit emulation or some FPGA magic).


I don’t think you can really say that ‘Hoff designed the 8008’. The architecture / ISA came from CTC and Federico Faggin actually designed the chip.

There may be an Intel document somewhere that says Hoff designed it but it’s probably the same situation as the 4004 where they wrote Faggin out of history after he left to found Zilog.


Slight correction - I mis-remembered and it was Hal Feeney who did the logic design (his initials appear on the chip) but Federico Faggin led the implementation team.

http://www.righto.com/2016/12/die-photos-and-analysis-of_24....


I'm going with Jack Kilby and Robert Noyce as the giants whose shoulders we stand on. A particular generation of chip is just one stepping stone of the many along the way.

https://www.keranews.org/health-science-tech/2014-09-12/on-t...

Also, a quibble, it's Halon not Argon. Nobody sprays water around a room full of burning electrical equipment.

Halon is no longer being manufactured for greenhouse reasons (afaik.) Alternatives being worked on: https://www.firetrace.com/fire-protection-blog/why-is-halon-...


> Also, a quibble, it's Halon not Argon. Nobody sprays water around a room full of burning electrical equipment.

But... Argon isn't water? What do you mean? Argon is also used for fire extinguishing in datacenters, according to Wikipedia:

https://en.wikipedia.org/wiki/File:Argon.jpg


I think he's saying that datacenters would not have water sprinklers and would instead use Halon fire extinguishers.

He's wrong and you/Wikipedia are right, BTW. Halons were banned in 1994, before datacenters became commonplace. Argon and other gaseous fire suppression systems (of which Halon was one, but a now-banned one) are used instead.


> Halons were banned in 1994, before datacenters became commonplace.

The article is talking about 1971, long before Halon was banned. It was commonplace to find Halon installations in computer rooms in the 1980s.


> Halons were banned in 1994, before datacenters became commonplace.

The 90s called and would like a word with you.


When that call is over, the '60s, '70s, and '80s are on hold.


Halon used to be the standard for data centers using gaseous fire extinguishers in the mainframe era. See: http://www.catb.org/jargon/html/D/dinosaur-pen.html

The op's a bit outdated, but not completely wrong. Even now a lot of data centers use various halon replacements rather than Argonite, though there are a number of different gaseous fire suppressants used in data centers these days.


The last data centre I worked in certainly had water sprinklers too.

The gaseous system was one-shot. If the fire continued or restarted, or another fire occurred before the system was recharged, the water sprinklers would save the building.

Redundant systems. Of course, some data centres probably have redundant gaseous suppression too.


If I recall correctly, Halons (bromomethanes) are ozone depleting, though they may also be greenhouse gases; I'm not sure.

The last data centre I worked in had an FM-200 liquefied gas fire suppression system; that's 1,1,1,2,3,3,3-heptafluoropropane, so probably still a greenhouse gas to some extent, but it wouldn't be a significant concern given they're not routinely discharged.


It's being taken out of service for the reasons you mention, although you can still replace bottles in existing systems with gas from the used market.


My bad, apparently argon is also used as a fire suppressant.


The real change was Intel shifting from manufacturing commodity RAM to creating microprocessors. The 4004 was a major factor. But it was the corporate strategy change, leading to the 8008, 8088, etc., that really drove the success of Intel and the microcomputer.


The 4004 is very interesting in just how mind-bogglingly odd it is. It's well worth reading the (very short) data sheet[1], or at least skimming the available instructions.

I wrote a tiny 4004 program for one of the emulators out of curiosity, and it was a wild experience.

[1] https://web.archive.org/web/20110601032753/http://www.intel....
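
For anyone curious what that oddness looks like, here is a rough Python sketch of how a few of its instructions behave (LDM, XCH, ADD over a 4-bit accumulator, a carry bit, and sixteen 4-bit index registers); the semantics are simplified from the data sheet and this is not a real emulator:

    # Toy model of a few Intel 4004 instructions, to give a feel for the 4-bit ISA.
    acc = 0               # 4-bit accumulator
    carry = 0             # 1-bit carry flag
    regs = [0] * 16       # sixteen 4-bit index registers

    def LDM(d):           # load a 4-bit immediate into the accumulator
        global acc
        acc = d & 0xF

    def XCH(r):           # exchange the accumulator with index register r
        global acc
        acc, regs[r] = regs[r], acc

    def ADD(r):           # add register r plus the carry bit to the accumulator
        global acc, carry
        total = acc + regs[r] + carry
        acc = total & 0xF
        carry = 1 if total > 0xF else 0

    # 9 + 8 = 17, which already needs the carry bit in a 4-bit world:
    LDM(9); XCH(0)        # register 0 = 9, accumulator cleared by the exchange
    LDM(8); ADD(0)        # accumulator = 1, carry = 1  (0x11 == 17)
    print(acc, carry)

Even simple arithmetic means juggling the carry and shuffling values through the index registers, which is part of what makes writing for it such a wild experience.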


>Google is about to introduce a next-generation Tensor chip, updating the ones found in the company’s Pixel 6 phones. It is basically an artificial-intelligence accelerator in your pocket, allowing your phone to adapt to you—your own personal neural network to train.

Is this accurate? Everything I've read online only discusses the tensor chip for inference (running the neural network to make predictions), not training a new model for that specific device. For example this blog only discusses how they are making inference faster: https://ai.googleblog.com/2021/11/improved-on-device-ml-on-p....


Google announced Federated Learning about 4 years ago:

https://ai.googleblog.com/2017/04/federated-learning-collabo...

I work adjacent to some teams using it. It's not idle research speculation; there are consumer product teams actively commercializing it.

(IMHO - speaking only for myself and not my employer - Federated Learning will be the most important invention to come out of Google, and its primary beneficiary will be someone other than Google, in the same way that the Alto was the most important invention to come out of Xerox but made a lot of people other than Xerox rich. It lets you build machine-learned models off of billions of devices without having centralized infrastructure or data storage, which finally makes the convenience and functionality of modern consumer software compatible with decentralized architectures and user privacy.)
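
To make the idea concrete, here is a minimal federated-averaging sketch in Python; the toy linear model, the three simulated "devices", and the function names are all hypothetical, and real deployments add sample-count weighting, secure aggregation, and differential privacy on top of this loop:

    # Federated averaging sketch: each device fits the shared model on its own
    # data, and only the resulting weights (never the raw data) are averaged.
    import numpy as np

    def local_update(global_w, x, y, lr=0.1, steps=5):
        # One device: a few gradient steps on a linear model, entirely locally.
        w = global_w.copy()
        for _ in range(steps):
            grad = x.T @ (x @ w - y) / len(y)
            w -= lr * grad
        return w

    def federated_round(global_w, devices):
        # "Server": average the locally trained weights.
        return np.mean([local_update(global_w, x, y) for x, y in devices], axis=0)

    # Hypothetical private data held by three devices.
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    devices = []
    for _ in range(3):
        x = rng.normal(size=(50, 2))
        devices.append((x, x @ true_w + rng.normal(scale=0.1, size=50)))

    w = np.zeros(2)
    for _ in range(20):
        w = federated_round(w, devices)
    print(w)  # converges toward [2, -1] without any device sharing its samples

The point of the sketch is just the data flow: model weights move between server and devices, raw samples never do.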


Federated learning is neat in some ways, but I'm wondering how it solves anyone's privacy concerns. As we saw with the FLoC controversy, people aren't going to be any happier about data being collected via an opaque algorithm that they can't audit.

I think it does avoid collecting data. The problem is that it doesn't help with gaining trust, so only an already-trustworthy organization could use it.


The algorithm itself is public - the blog post has arXiv papers on it.

People have shown themselves very willing to trust algorithms that are opaque (as in difficult to understand) but auditable (by other people). Witness Bitcoin or Ethereum, where they will put their life savings into an algorithm. They're just not willing to trust organizations or people, where the organization can change the rules of the game at whim. (Even this has exceptions - ETH flourished over ETC despite doing a hard fork to rewrite history after the DAO hack.)


>> The algorithm itself is public - the blog post has arXiv papers on it.

This is misleading. The implementation (running on a phone) would not be and there is no way to verify.


Why not? The implication of this comment thread is that the implementation would not be Google's.


The software and hardware supply chains need to be trusted. This would require more transparency than we get from Apple or Google. It’s doable, but who is going to do it?

I don’t think people trusting cryptocurrency is much of an argument because cryptocurrency investors in general are a minority, and also, people do a lot of stupid things with cryptocurrency.


> made a lot of people other than Xerox rich

But this time Google should be able to peek into what other people do with its invention and get rich too, right?


Potentially, but that's not what killed Xerox (who had a decade head start on Microsoft and Apple and had full visibility on what they were doing in the marketplace for another decade before GUIs became commonplace).

They fell victim to the Innovator's Dilemma: when a new technology comes out that fundamentally rewrites the rules about who your customers are and how markets should be organized, existing market leaders cannot maintain their lead. That's because the market itself gets restructured - where there was previously one market with a dominant firm and an established set of customers, there are now often new markets with new dominant firms and new sets of customers. Basically, it rewrites all the assumptions the business is structured around.


I'm pretty sure TechTechPotato (whom I consider a reliable source) said it would be used for training as well as inference, but didn't mention use for specific devices.


who's probably better known as Dr Ian Cutress


Hmmm. That's only 11 months after the first microprocessor -- Ray and Bill Holt's Central Air Data Computer for the F-14 -- flew for the first time. (Dec 1970)

[https://www.wired.com/story/secret-history-of-the-first-micr...]

The 4004 might have been first if Intel hadn't been sidetracked to consider a CPU chip for Datapoint. Then TI got involved.

[https://web.archive.org/web/20150512143915/http://thetrendyt...]


It’s not really a microprocessor if it needs several chips for the CPU. I’d agree with Ken Shirriff that it’s between TI and Intel.

https://spectrum.ieee.org/the-surprising-story-of-the-first-...


I’d argue that 1. it wasn’t a microprocessor, and 2. there is no point counting “secret hardware”, the same way we don’t count never-released prototypes.


Thought it was the 6502.


I was certain it was going to be the Pringle.


Sad to see another article that gets the roles of those involved in the creation of the 4004 confused. Wikipedia gets it almost right:

> The chip design, implemented with the MOS silicon gate technology, started in April 1970, and was created by Federico Faggin who led the project from beginning to completion in 1971. Marcian Hoff formulated and led the architectural proposal in 1969, and Masatoshi Shima contributed to the architecture and later to the logic design.

Need to add that Stan Mazor worked with both Hoff and Faggin on the design.


Does anyone have a source for the claim “Most of the wealth created since 1971 is a result of Intel’s 4004 microprocessor”?


The article explains its reasoning - most of the wealth created since 1971 has been in the finance, tech, and globalization sectors. Other sectors have been shrinking, but people in finance, tech, or globalization have been doing great. (I assume he's measuring "wealth" as in "net change in revenue", where this statement is roughly correct.) Those three sectors are essentially dependent upon the microprocessor - tech obviously so, but finance is basically the province of computer models these days, and globalization is enabled by the communication, data storage, and algorithmic improvements made possible by software.


The article title is not very accurate. The 4004 was an evolutionary dead-end. The later, unrelated, 8008 design was a much more important chip. But the 4004 was the first successful COTS microprocessor, which is why it is remembered.


Haven't read the article but my immediate thought is that it's kind of obvious if you simply replace it with "microprocessors". I don't know about the historical significance of the 4004 though.


The 4004 was the first commodity-off-the-shelf microprocessor.


I think you mean Commercial Off The Shelf - COTS


> Does anyone have a source for the claim “Most of the wealth created since 1971 is a result of Intel’s 4004 microprocessor”?

If they attributed it to a Famous Important Person would it seem more credible to you?

It's just hyperbole.


I don't think it's a hard fact statement, more of an "I believe this to be so, isn't this incredible!" type of statement.


The wood textured patch is a nice touch. More electronic components should feature wood paneling.


Thank you, WSJ with ad banners and paywalls is practically unusable with the Tor Browser.


We detached this comment from https://news.ycombinator.com/item?id=29222398.


Can't help but think this is a planted puff piece while Intel is being hammered by Apple.


I suspect they picked that chip because tomorrow is the 50th anniversary of its launch (Nov. 15, 1971).


This conspiracy goes back further than I'd feared.


The article makes that clear in its second sentence, and it makes a better title too, so I've moved it above.


You can help yourself by reading it as inspiration drawn from technologies that changed the world, regardless of the current situation. The computer chip, and more fundamentally the invention of the transistor, should be reflected on not through the lens of Competitor A vs. Competitor B, but through an appreciation of the science, the engineering, and all the people that made it happen.


You are being downvoted but it absolutely is a puff piece.


it doesn't really matter right? Intel are so screwed



