I have alluded to the idea that one cause for the common reluctance to consider physicalism — in particular, the claim that our minds can in principle be characterized entirely by physical states — is an asymmetry in how people perceive characterization. This reluctance can be alleviated by analogy to how our external senses can characterize each other, and how abstract manipulations of those senses using recording, playback, and editing technologies have made such characterizations useful and intuitive.
We have numerous external senses, and at least one internal sense that people call "thinking" or "consciousness". In part because you and I can point our external senses at the same objects, collaborative science has done a great job characterizing them in terms of each other. The first step is to appreciate the symmetry and non-triviality of this situation.
First, at a personal level: say you've never sensed a musical instrument in any way, and for the first time, in the dark, you hear a cello playing. Then later, you see the actual cello. You probably wouldn't immediately recognize these perceptions as being of the same physical object. But watching and listening to the cello playing at the same time would certainly help, and physically intervening yourself to see that you can change the pitch of the note by placing your fingers on the strings would seal the deal: you'd start thinking of that sound, that sight, and that tactile sense as all coming from one object "cello".
Before moving on, note how in these circumstances we don't conclude that "only sight is real" and that sound is merely a derivative of it, but simply that the two senses are related and can characterize each other, at least roughly speaking: when you see a cello, you know what sort of sounds to expect, and conversely.
Next, consider the more precise correspondence that collaborative science has provided, which follows a similar trend: in the development of the theory of sound as longitudinal compression waves, first came recording, then playback, and finally editing. In fact, the first intelligible recording of a human voice, made in 1860, was played back for the first time in 2008, using computers. So, suppose it's 1810, well before the invention of the phonautograph, and you've just heard the first movement of Beethoven's 5th. Then later, I unsuggestively show you a high-res version of this picture, with zooming capabilities:
If you're really smart, and have a great memory, you might notice how the high and low amplitudes of that wave along the horizontal axis match up pretty well with your perception of how loud the music is at successive times. And if you zoom in, you might notice that finer bumps on the wave match up pretty well with the moments you heard higher notes. These connections would be much easier to make if you could watch and listen at the same time: that is, if you could see a phonautograph transcribing the sound of the concert to a written wave in real time while you listen to it.
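This correspondence — overall wave height tracking loudness, the rate of the finer bumps tracking pitch — can be sketched numerically. Here's a toy illustration using only NumPy; the signal and its parameters (a quiet low note followed by a loud high note) are invented for the example:

```python
import numpy as np

rate = 8000  # samples per second
t = np.arange(0, 2.0, 1.0 / rate)

# First second: a quiet low note; second second: a loud high note.
amplitude = np.where(t < 1.0, 0.2, 0.9)
frequency = np.where(t < 1.0, 220.0, 880.0)  # Hz
wave = amplitude * np.sin(2 * np.pi * frequency * t)

# "Zooming out": the peak level in each 50 ms window tracks loudness.
window = rate // 20  # 400 samples = 50 ms
envelope = wave.reshape(-1, window).max(axis=1)

# "Zooming in": counting sign changes (zero crossings) per window
# tracks pitch, since a tone at f Hz crosses zero about 2*f times a second.
signs = np.signbit(wave.reshape(-1, window)).astype(int)
crossings = np.abs(np.diff(signs, axis=1)).sum(axis=1)
est_freq = crossings * 20 / 2.0  # crossings per second, halved

print(envelope[0], envelope[-1])   # small, then large
print(est_freq[0], est_freq[-1])   # near 220, then near 880
```

The envelope recovers the loudness profile and the crossing counts recover the pitch, which is all our hypothetical 1810 listener is doing by eye when they line the picture up with their memory of the concert.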
Even then, almost anyone in their right mind from 1810 would still be amazed that such an image, and the right interpretive mechanism — say, a computer with great software and really good headphones — are enough to perfectly reproduce the sound of that performance to two stationary ear canals, right down to the audible texture of horse-hair bows against catgut strings and ever-so-politely restless audience members. They'd be even more amazed that Fourier analysis on a single wave can separate out the individual instruments, each of which can then be listened to on its own with decent precision.
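That separation trick can be sketched with a discrete Fourier transform. In this toy version each "instrument" is a pure tone at a distinct frequency — real instruments have overlapping harmonics, which is why the real separation is only "decent" — and the instrument names and frequencies are invented for the example:

```python
import numpy as np

rate = 8000
t = np.arange(0, 1.0, 1.0 / rate)

# Two "instruments" playing together into one microphone.
cello = 1.0 * np.sin(2 * np.pi * 220 * t)
violin = 0.5 * np.sin(2 * np.pi * 660 * t)
mixture = cello + violin

# Decompose the single mixed wave into frequency components...
spectrum = np.fft.rfft(mixture)
freqs = np.fft.rfftfreq(len(mixture), 1.0 / rate)

# ...then keep only the band around each instrument and invert.
def isolate(center, width=50.0):
    band = np.where(np.abs(freqs - center) < width, spectrum, 0)
    return np.fft.irfft(band, n=len(mixture))

cello_only = isolate(220.0)
violin_only = isolate(660.0)

# Each recovered track closely matches the instrument played alone.
print(np.max(np.abs(cello_only - cello)))
print(np.max(np.abs(violin_only - violin)))
```

Because the tones here sit exactly on FFT bins, the recovery is essentially perfect; the point is just that a single wave contains enough information to pull the instruments back apart.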
But our modern experiences with audio recording, editing, and playback — the fact that we can control sound by playing back and manipulating abstract representations of sound waves — have deeply internalized our model of sound (if not hearing) as a "merely physical" phenomenon. Just as one person easily develops the notion of a single "cello" as they see, hear, and play the cello at the same time, collaborative science has developed the notion of a single object or model called "physical reality" to have a clear meaning in terms of our external senses, because those are the ones we most easily collaborate with.
Now let's talk about "consciousness". Consider that you have experienced an inner sense of "consciousness", and you may be lucky enough to have seen functional magnetic resonance images of your own brain, or luckier still to have watched them as they were taken. These two senses, although they are as different as the sight and sound of a cello, are perceptions of the same object: "consciousness" is a word for sensing your mind from the inside, i.e. from actually being it, and "brain" is a word for the various ways of sensing it from the outside. It's not surprising that this will probably be the last of our senses to be usefully interpreted scientifically, because it's apparently very complicated, and the hardest one to collaborate with: although my eyes and yours can look at the same chair, our inner senses are always directed at different minds.
Under Descartes' influence, the language I'm using here is somewhat suggestive of dualism in its distinction between physical phenomena and our perceptions of them, but in fact it seems that some of our sensations simply are physical phenomena. Some combination of physical events — like air molecules hitting the eardrum, electro-chemical signals traversing the auditory nerves, and subsequent reactions in the brain — is the phenomenon of hearing. I'm not saying your experience of hearing doesn't happen, but that it is the same phenomenon as that described by physics and biology texts using equations and pictures of the auditory system, just as the sight and sound of a cello are direct perceptions of the same object "cello".
But when most people consider consciousness supervening on fundamental physics, they often end up in a state of mind that is better summarized as thinking "pictures of dots and waves are all that exists", without an explicit awareness that they're only thinking about the pictures. And this just isn't very helpful. A brain is not a picture of a brain any more than it is the experience of thinking; in fact, in terms of stages of perception, it's much closer to the latter, since a picture has to pass through the retina and optic nerve before you experience it, but the experience of thinking is the operation of your cerebral network.
Indeed, the right interpretive mechanism — for now, a living human body is the only one we've got — seems to be enough to produce, for "you", the experience of "thinking" from specific configurations of cells, and hence particles, that can be represented (for the moment with low fidelity) by pictures like this:
In our progressive understanding of mind, this is analogous to the simultaneous-watching-and-listening phase of learning: we can watch pictures of our brains while we "listen" to our own thoughts and feelings. If at some point computers allow us to store, manipulate, and re-experience partial or complete mental states by directly interfacing with the brain, we'll be able to update our mind-is-brain model with the same sort of confidence as sound-is-longitudinal-compression-waves. Imagine intentionally thinking through the process of solving a math problem while a computer "records" your thoughts, then using some kind of component analysis to remove the "intention" from the recording (which may not be a separable component, I'm just speculating), and then playing it back into your brain in real time so that you experience solving the problem without trying to do it.
Wouldn't you then begin to accept characterizing thoughts as brain states, like you characterize sounds as compression waves? A practical understanding like that — the level of abstract manipulation — would seal the deal for me. And naively, it seems no more alien than the complete supervenience of sound or smell on visual representations of them.
This isn't an argument that the physicalist conception of consciousness is true, but simply that it's not absurd, and follows an existing trend of identifications made by personal experiences and science. Then all you need is heaps and loads of existing evidence to update your non-zero prior belief to the point where you recognize it's got the best odds around. If they ever happen, future mind-state editing technologies could make "thoughts = brain states" feel as natural as playing a cello without constantly parsing "the sight of cello" and "the sound of cello" as separate objects.
Even as these abstract models become more precise and tractable than our intuitive introspective models, this won't ever mean thought "isn't real" or "doesn't happen", any more than sight, touch, or hearing "doesn't happen". You can touch, look at, and listen to a cello, yielding very different experiences of the exact same object. Likewise, if one demands a dualist perceptual description, when you think, you're "introspecting at" your brain. Although that's a very different experience from looking at an fMRI of your brain, or sticking your finger into an anaesthetized surgical opening in your skull, if modern science is right, these are experiences of the exact same physical object: one from the inside, two from the outside.
In short, consciousness is a sense, and predictive and interventional science isn't about ignoring our senses... it's about connecting them. Physics doesn't say you're not thinking, but it does connect and superveniently reduce what you experience as thinking to what you and everyone else can experience as looking at your brain.
It's just that awesome.