I never agreed with him completely on this, but his main point is that if you have a system governed by rules on its input (like someone looking up the answer in the Chinese room), there's no subjective understanding anywhere in it. So even if you had such a device (room) that behaved flawlessly, nothing in there actually "understands" what's going on. The paper the Chinese characters are written on doesn't understand; it's just paper. And the person carrying out the instructions doesn't understand either; they're just following orders. So what's having the subjective experience/understanding?
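To make the "rules on input" point concrete, here's a minimal toy sketch of the room as pure symbol lookup. The rulebook entries are made-up placeholders standing in for Searle's instruction book, not real conversational data; the point is that nothing in the code attaches meaning to the characters, it just matches and copies shapes:

```python
# A toy "Chinese room": purely syntactic lookup, no comprehension anywhere.
# RULEBOOK is a hypothetical stand-in for the instruction book in the
# thought experiment; the entries are illustrative placeholders.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字？": "我叫小房间。",    # "What's your name?" -> "I'm called Little Room."
}

def chinese_room(symbols: str) -> str:
    """Return whatever response the rulebook pairs with the input symbols.

    The function only matches string keys and copies out the paired
    strings. If the lookup were flawless, the room's behavior would be
    flawless too, yet no component of it understands Chinese.
    """
    return RULEBOOK.get(symbols, "请再说一遍。")  # fallback: "Please say that again."

if __name__ == "__main__":
    print(chinese_room("你好吗？"))  # behaves as if it understands; it doesn't
```

However far you scale the rulebook up, the argument goes, you only get more of the same lookup, never understanding.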
The main point I took away is that he sees consciousness as an ordinary biological process, and a simulation of a process is not the same as the process itself, in the same way that a computer simulation of a stomach digesting food isn't the same as an actual stomach digesting food. No matter how good the simulation is, it doesn't actually digest anything. Likewise, a simulation of consciousness isn't actually consciousness, and doesn't have a personal subjective experience.