Mind Reading Computer Interfaces
The military is working on a plan to:
figure out how to monitor brain activity as it happens -- and then have that activity shape a computer's display of information.
Roxanne characterizes this as “mind-fuck research”. And, let us be fair, when something is being funded by DARPA it isn't because someone wants to do happy warm fuzzies. Those DARPA boys are thinking about improved interfaces for killing machines:
So much of what’s done today in the military involves staring at a computer screen — parsing an intelligence report, keeping track of fellow soldiers, flying a drone airplane — that it can quickly lead to information overload.
This research is about war.
Roxanne asks, “Can you think of any commercial applications?” I can, easily, in everything from surgery to fast food. A computer interface that adapts, automatically, to my needs even as I conceive them? Yes, there will be commercial applications in just about anything where a person needs to manipulate a large amount of data quickly. This will have some predictable effects that aren't much better than the military applications – it will destroy jobs. As businesses become more streamlined, they'll need fewer and fewer people to run them. It is part of the process that will someday force us to confront the post-labor society – very few people will be needed to do a great number of things, and, soon, perhaps none at all.
It will also have truly wide-ranging law enforcement and privacy implications, as is already happening with brain fingerprinting. A machine that reads a person's mind very obviously has tremendous potential for abuse, but also tremendous potential to stop abuse. No more lying on the stand, right? Hook someone into an augmented cognition device and power it up. It has the potential for effective and certain lie detection.
Hooked up to unwilling participants, it probably also has tremendous use as an interrogation tool – not only in the legitimate sense of catching murderers and the like, but in the illegitimate sense of tracking down enemies of the state or things that are “wrong” but not illegal (such as extra-marital affairs).
Or psychology! A machine that can give real insight into what a person thinks and feels. The ability to look into a person's mind – even partially – will be of great use to psychologists and psychiatrists and could easily lead to very real advances in those fields. Human minds won't be opaque, or at least not as opaque as they are now, when the only way “in” is through the clumsy medium of language – often while a person's honesty is hindered by the very trauma that brings them to a psychologist or psychiatrist in the first place.
On top of that, there are huge privacy concerns. There will be this machine and it will be reading your mind. Data theft is an issue with computers now. A machine that can honestly figure out pieces of your consciousness, and manipulate them, will be learning an awful lot about you. Just how much is uncertain, but in the fashion of technological progress it will tend to increase rapidly. After all, the more the computer knows about your wants and needs (even those you keep from yourself), the better it will serve you. The computer will learn things about you that no one else knows, in order to serve you better. To adapt properly will require accurate and useful information about you in all sorts of ways that, initially, we might find creepy. F'rex, what if a person works best while sexually aroused?
For me, the more exciting implications are in consciousness expansion. The computer will know things about us, real things, and personal ones. That data could be shared and studied. After long association with a person that has had a lot of trauma, there will be data about what a traumatized person wants and needs. That data could be shown to others, who then might take understanding about trauma from it.
It will, I think, also show us a lot of trauma that is now hidden. Those conservative fundie Christian workaholic cheerleaders for global corporate imperialism – we'll have access to what motivates them, too. Which will help everyone understand what is going on. Very exciting stuff.
Additionally, if I haven't gone on long enough about this subject, one of the key prerequisites for a society of consent rather than violence is clear and concise data about the relevant issues in a person's life. That's the whole point behind augmented cognition as DARPA envisions it. We are overloaded with data. This is not, precisely, unknown to netizens. This sort of data interface would allow people to get concise information based on their wants and needs, handled in a way that maximizes the efficiency of communication between the computer and the user. And, of course, that computer is connected to the world through the Internet, so it becomes an increasingly efficient interface between users on the Internet. We will be able to share, amongst ourselves, information of increasing complexity and precision more efficiently than ever before. Blogs, newspapers, television will seem ridiculously crude in comparison to this ever updated, ever personalized, ever concise and efficient data stream.
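The adaptive interface described above can be caricatured in a few lines of code. This is a purely hypothetical sketch – the function names and the "cognitive load" signal are invented for illustration, not any real AugCog API: imagine the display receives a load score from a brain monitor and trims what it shows accordingly.

```python
# Hypothetical sketch: an information feed that adapts to measured
# cognitive load. Every name here is invented for illustration;
# no real brain-monitoring API is being described.

def adapt_display(items, cognitive_load, max_load=1.0):
    """Show fewer, higher-priority items as the user's measured load rises.

    items: list of (priority, message) pairs; higher priority = more urgent.
    cognitive_load: 0.0 (idle) up to max_load (overloaded), imagined as
    coming from a brain-activity monitor.
    """
    # Headroom shrinks from 1.0 (relaxed) toward 0.0 (overloaded).
    headroom = max(0.0, 1.0 - cognitive_load / max_load)
    # At zero load show everything; at full load show only the single
    # most urgent item.
    budget = max(1, round(headroom * len(items)))
    ranked = sorted(items, key=lambda pair: pair[0], reverse=True)
    return [message for _, message in ranked[:budget]]

feed = [(9, "incoming threat"), (5, "fuel at 40%"),
        (2, "mail from HQ"), (1, "weather update")]
print(adapt_display(feed, cognitive_load=0.75))  # → ['incoming threat']
print(adapt_display(feed, cognitive_load=0.0))   # all four items, most urgent first
```

The real systems would of course be vastly more sophisticated, but the shape is the same: a feedback loop in which the machine's output is continuously re-ranked against a signal read from the user's head.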
And because the more a person uses the system the more it adapts to them in a very personal way, and because these systems are connected, we will gain increasingly large insights into the people who use them. It isn't just that research groups will be able to study this data and talk about what is in our minds, or that psychologists will use it for therapy, or police for interrogations – we will be able to use this information to see very personal, very private things about each other, if we are sufficiently brave. (F'rex, do you want your significant other to know that you work best sexually aroused? To know that working makes you sexually aroused?) These systems will, in some ways, mirror our consciousness (at least as we use them), and that data can be experienced by others – through the same systems.
(Can you imagine what instant messaging would be with something like this? Where the system would try to express your real intentions to the person you're messaging in an adaptive way that increases the depth of conversation? That, to some extent, the enhanced cognition will help them feel what you are feeling? And that the system will know when it has succeeded? Can you imagine a world of increasing certainty in conversation? Where people can feel as you feel?)
People might think I'm being pretty over-the-top with this sort of thing. I don't think so. Sure, initially, it's just going to be a handless interface used in jet fighters or killer robots. But the plans are far more ambitious, and we're far closer to realizing them than, I think, most people want to admit. Because what is being discussed is an adaptive control interface that sits almost directly between our brain and a computer. To do this, the computer will become a practical learning tool about the consciousness of its users, connecting us to each other with greater precision and efficiency than ever before. To use this sort of thing on weapons is callow and crude, much as the Internet's military origins were callow and crude – and yet the Internet has come to the fore as a system of interpersonal communication, people talking horizontally to each other rather than vertically through tightly hierarchical systems. This will expand not only our horizons but the depths of our communication in magnificent ways. If they succeed – and, eventually, they will – the military will lose control of this almost instantly.