
Currently, brain-computer interfaces can distinguish the thoughts we use to give "orders" to a brain-controlled device from the rest of our thinking and sensory input. Brain activity that is not part of a command is interpreted as "noise" and is simply not used by the BCI. As the technology evolves, BCIs will become more capable of deciphering the rest of our brain activity and recognizing our emotional state.
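To make that distinction concrete, here is a minimal sketch of how a BCI pipeline might separate "command" activity from everything else. It is purely illustrative: the simulated band-power features, the linear classifier, and the forward-or-drop logic are assumptions for the sake of the example, not any real device's method.

```python
# Hypothetical sketch: a BCI loop that keeps only "command" brain activity
# and discards everything else as noise. Data, features, and classifier
# are illustrative assumptions, not a real device's API.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def simulate_windows(n, is_command):
    # Simulated band-power features per short EEG window; "command" windows
    # (e.g. imagined movement) get a slightly shifted distribution.
    shift = 1.0 if is_command else 0.0
    return rng.normal(loc=shift, scale=1.0, size=(n, 8))

# Train a simple classifier: 1 = command, 0 = background activity.
X_train = np.vstack([simulate_windows(200, True), simulate_windows(200, False)])
y_train = np.array([1] * 200 + [0] * 200)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

# At run time, every incoming window is classified; only windows labelled
# as commands are forwarded to the device, the rest is dropped as noise.
incoming = np.vstack([simulate_windows(5, True), simulate_windows(5, False)])
for window in incoming:
    if clf.predict(window.reshape(1, -1))[0] == 1:
        print("forward to device")
    else:
        print("ignored as noise")
```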
Think about a stressful day at work 20 years from now. Every worker in your company has been given a thought-controlled communication device, what used to be known as a smartphone. You are trying to send a report to your superior about a project that is not going the way it is supposed to. At the same time, your working memory is flooded with information: the colleague who didn't do his task well, how much you dislike this work environment, the two meetings waiting for you today, the fact that you won't be home in time to pick up your kid from school, the other job offer you passed on last month, and so on.

At the company's IT center, the main software system monitors and records all communication. It is very likely that the human resources department would collect additional data, such as its employees' emotional states and thoughts, as soon as specific brain patterns related to stress and dissatisfaction show up on the mainframe.
Will you be able to hide your "I hate my job" thought from your employer?!