Hacker News

This is off topic, but I need to ask this question and I fear I may lose the opportunity to field it to this audience of experts. I'm interested in two things and wonder whether this book, or something else, would be of use to me. Assume I have a strong math background, and assume I understand Maxwell's equations. What book/resource provides a good understanding of the approximations (from first principles) needed for a reasonable simulation of, say, a (layered) circuit board carrying signals of increasing frequency? And if you went more complex: what would it take to understand (and again simulate in software) the behavior of, say, 28 nm CMOS circuits? Any help would be appreciated. This is more of a longer-term/back-burner project for me, but I'd love to understand this aspect of hardware.



Have a look for resources on the finite-difference time-domain (FDTD) method for electromagnetic simulation. Also in use are the finite element method (FEM) and the method of moments (MoM). Wikipedia has a decent rundown: https://en.wikipedia.org/wiki/Computational_electromagnetics
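To make FDTD concrete, here's a minimal one-dimensional sketch in normalized units (function name and parameters are my own, not from any particular package). The Yee scheme stores E and H on staggered grids and leapfrogs their updates in time:

```python
import numpy as np

def fdtd_1d(steps=200, nx=200):
    """Toy 1D FDTD: normalized units (c = 1, dx = 1), PEC walls at both ends."""
    ez = np.zeros(nx)       # electric field at integer grid points
    hy = np.zeros(nx - 1)   # magnetic field, staggered half a cell
    dt = 0.5                # Courant number 0.5 < 1, so the scheme is stable
    for n in range(steps):
        # Update H from the spatial difference of E (half-step leapfrog)
        hy += dt * (ez[1:] - ez[:-1])
        # Update interior E from the spatial difference of H
        ez[1:-1] += dt * (hy[1:] - hy[:-1])
        # Soft Gaussian source injected in the middle of the grid
        ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)
    return ez
```

The real packages add absorbing boundaries (PML), material models, and 3D grids, but the leapfrog core is exactly this.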

The Art of Electronics will not help you with simulation; it's a very good practical manual for analogue electronics design. It'll help you understand how a circuit works, but electrical engineers rarely need to solve Maxwell's equations when designing a circuit. It's not a bad idea to read through it, however; I can't see many situations where you'd dive straight into EM simulation without understanding how the circuit works first.


For the first, probably some textbook on SPICE simulation and/or upper-undergraduate EE texts, perhaps verging into RF engineering. I used to need to worry about these things at my old job at MIT Lincoln Lab, but my boss said estimating true parasitic inductances and capacitances is far more of an art than a science in real systems. What you should look up and be familiar with are 'stripline' and 'microstrip', which take advantage of the layering and ground planes to get transmission lines with controlled characteristic impedance.
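As a concrete example of the controlled-impedance point, here's a sketch of Hammerstad's closed-form microstrip approximation (a standard textbook formula; the function name and the example values below are mine):

```python
import math

def microstrip_z0(w, h, er):
    """Approximate characteristic impedance (ohms) of a microstrip trace.
    w: trace width, h: dielectric height (same units), er: relative permittivity.
    Hammerstad's closed form; roughly 1% accurate for typical geometries."""
    u = w / h
    # Effective permittivity: fields live partly in the dielectric, partly in air
    eeff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 / u)
    if u <= 1:  # narrow-trace correction and formula
        eeff += (er - 1) / 2 * 0.04 * (1 - u) ** 2
        return 60 / math.sqrt(eeff) * math.log(8 / u + u / 4)
    return 120 * math.pi / (math.sqrt(eeff) * (u + 1.393 + 0.667 * math.log(u + 1.444)))
```

For 1.6 mm FR-4 (εr ≈ 4.4), a trace about 3 mm wide lands near the usual 50 Ω target; narrower traces give higher impedance.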

For the second, you really need to get into solid-state physics if you want to understand the details of CMOS, which entails some requisite knowledge of both quantum mechanics and statistical mechanics/thermodynamics. Maxwell's equations alone won't cut it.
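For a sense of where circuit-level abstraction sits above the device physics, here's the classic long-channel "square-law" MOSFET model (a textbook sketch with illustrative parameter values; at 28 nm this model is badly wrong because of velocity saturation, DIBL, and so on — it only shows the kind of compact model SPICE builds on):

```python
def nmos_id(vgs, vds, vt=0.5, k=2e-4, lam=0.1):
    """Long-channel (square-law) NMOS drain current in amps.
    vt: threshold voltage (V), k = mu_n * Cox * W/L (A/V^2),
    lam: channel-length modulation parameter (1/V)."""
    vov = vgs - vt                                   # overdrive voltage
    if vov <= 0:
        return 0.0                                   # cutoff (ignores subthreshold leakage)
    if vds < vov:
        return k * (vov * vds - vds ** 2 / 2)        # triode (linear) region
    return (k / 2) * vov ** 2 * (1 + lam * vds)      # saturation region
```

Real 28 nm design uses BSIM-class compact models with dozens of fitted parameters, which is exactly where the solid-state physics comes in.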


I can only direct you to the simulation package we used at university; I have no idea if it is the best, but it was fairly good during the time of my studies.

http://www.sonnetsoftware.com/


Thanks everyone for the responses. It seems that, for the depth I'm interested in, this will require deep study on my part rather than just spare brain cycles. Here's the thinking that prompted this; sorry for the brain dump and lack of links, these are startup-ish ideas.

In business you have either the forces of disintermediation/vertical integration, or specialization. We've moved from "real men have foundries" to fabless, with consolidation of foundries. I think of ARM as a software company. I am a software person. The goal of software is to turn hardware X into X-as-a-service, à la Amazon AWS. Part of the XaaS bargain is compromise: what you lose in specialization you gain in flexibility; you shift capex into opex and make the model friendlier to venture capital.

Okay, so what?

What is post-ARM? There are two opposing forces in modern compute. On the one hand you have Amazon with Annapurna, and Google with its custom switches, consolidating the datacenter into a service: a really big black box with an internet-facing API. On the other hand you have the free-wheeling world of IoT, with all the big players trying desperately to create walled gardens. My thesis: that won't work for IoT.

What does this mean for future hardware? Well, think of what it would mean to turn something into XaaS. Consider the foundry as a VHDL/Verilog-to-silicon service. Simplify the frontend and the backend, i.e. limit/streamline the HDL and the output geometry (TSSOP/BGA/etc.), in order to increase yield. The software would look like mix-and-match: pick a core (ARM/MIPS/lowRISC), pick an on-chip bus, pick SRAM, pick a memory controller (hey, cool, you're dropping the DRAM controller for several NAND controllers and a small on-chip flash), pick accelerators, Ethernet, etc. Of course this is essentially already the case in hardware, except for the still-large upfront capital expense.
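A sketch of what that mix-and-match "menu" might look like as data. Everything here, names and fields alike, is hypothetical and invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class SocSpec:
    """Hypothetical order form for a streamlined HDL-to-silicon service.
    Constraining the choices is what would keep yield up."""
    core: str                   # e.g. "ARM", "MIPS", "lowRISC"
    bus: str                    # on-chip interconnect, e.g. "AXI4"
    sram_kib: int               # on-chip SRAM
    memory_ctrl: list = field(default_factory=list)  # e.g. ["NAND"] * 4
    accelerators: list = field(default_factory=list)
    package: str = "BGA"        # limited output geometry: TSSOP/BGA/etc.

# A customer drops the DRAM controller for several NAND controllers:
board = SocSpec(core="lowRISC", bus="AXI4", sram_kib=512,
                memory_ctrl=["NAND"] * 4, accelerators=["AES"],
                package="TSSOP")
```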

But consider something else for IoT, something for the future electrician/carpenter/plumber. That is, change the customer from the end user to experts in the trades or proficient DIYers. To do this a startup would specify the physical/mechanical/electrical/thermal/acoustic/etc. properties of modules, be it li-fi, smart sockets, servos, etc., and make software to mix and match them into devices built by these trades experts to solve problems. For example, someone who installs blinds could put together modules to build something that automatically controls the blinds. To make this work requires some serious cross-disciplinary thinking. On the backend, these modules would be fully and openly specified, with factories anywhere in the world competing to build them. The frontend would be educating and marketing to tradespeople as a way to make their practices more lucrative and increase their fees. The goal would be this software with lots of pre-designed module combinations, plus tips and tricks for "blinging" up your home. To begin, the startup would have to design and manufacture its own modules to show viability, then drop out of the picture and earn money on royalties from modules interoperable with its design software.

Yes, if only I had a few megabucks lying around... :)


We have gone fabless, but this hasn't altered the capital equation for hardware as much as one might hope. Chips tend to become 'pad limited' and 'dissipation limited', so there is a limit on what can be done in a certain volume. Some of the more creative stuff isn't about chips, it's about interconnects. Which gets us to #2.

The "Internet of Things" isn't really an Internet; the disruption is that everything is a network, and yes, Sun had it right when they said the network is the computer. What you have is a collection of agents which cooperate to achieve a commanded objective. No one cares about 'smart dust'; what they care about is the transformative aspect of real-time contextual data. The IoT is about creating adapters which convert ambient information into data that can be collectively consumed and processed by computers. A billion barometers on smartphones sample the pressure where they are; combine that with GPS coordinates and you transform ambient data (air pressure at a known point) into a consumable dataset, which, observed over time, can inform on larger processes such as weather fronts. None of that needs "new chips", but it can benefit from easier assembly of existing capabilities.
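The barometer example boils down to a trivial aggregation step. A toy sketch (the data and function names are mine, invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Toy samples: (lat, lon, unix_time, pressure_hPa) from phone barometers.
samples = [
    (47.61, -122.33, 1700000000, 1012.3),
    (47.61, -122.33, 1700000060, 1011.9),
    (40.71,  -74.01, 1700000000, 1018.2),
]

def grid_pressure(samples, cell=0.5):
    """Bucket readings into lat/lon cells and average them: the simplest way
    to turn ambient point measurements into a consumable pressure field."""
    cells = defaultdict(list)
    for lat, lon, _, p in samples:
        key = (round(lat / cell) * cell, round(lon / cell) * cell)
        cells[key].append(p)
    return {key: mean(ps) for key, ps in cells.items()}
```

Snapshot this field every few minutes and the time series is what lets you watch a front move; none of it needs new silicon, just plumbing.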


If you threw out an off-topic post and actually put contact information in your profile, and I had something to add, I would email you and nobody here would care (although your off-topic post might get downvoted into oblivion). But submitting an off-topic post with no way to contact you is not very useful.


"with no way to contact you"

Why doesn't the "reply" button suffice? Other people may be interested in the answers.


I've made changes to my profile; I hope they're useful. I also voted you up to counter the downvote. Sorry, but it's true: others could be interested in what you have to say.





