Sioux Falls Free Thinkers

"Persistence and determination alone are omnipotent!"

For all those with Open Minds!

An Open Mind by Megan Godtland


Do People Have Psychic Abilities?
Open-Minded Free Thinking at its Finest

From a lecture in the "Understanding the Mysteries of Human Behavior" course by Professor Mark Leary: Do People Have Psychic Abilities? Venture into the field of parapsychology—the study of anomalous psychic experiences such as extrasensory perception. As Professor Leary reveals what decades of fascinating research (including special approaches such as the ganzfeld and presentiment studies) have uncovered about these phenomena, decide for yourself whether psychic abilities are myth or reality.

Professor Mark Leary is Garonzik Family Professor of Psychology and Neuroscience at Duke University. He has received a Lifetime Career Award from the International Society for Self and Identity and is a fellow of the Association for Psychological Science, The American Psychological Association, and the Society for Personality and Social Psychology. The author of a dozen books on social motivation, emotion, and the ego, Professor Leary is among the world's top social and personality psychologists.

3-30-20 Machine translates brainwaves into sentences
Scientists have taken a step forward in their ability to decode what a person is saying just by looking at their brainwaves when they speak. They trained algorithms to translate the brain patterns into sentences in real time and with word error rates as low as 3%. Previously, these so-called "brain-machine interfaces" have had limited success in decoding neural activity. The study is published in the journal Nature Neuroscience. The earlier efforts in this area were only able to decode fragments of spoken words or a small percentage of the words contained in particular phrases. Machine learning specialist Dr Joseph Makin from the University of California, San Francisco (UCSF), US, and colleagues tried to improve the accuracy. Four volunteers read sentences aloud while electrodes recorded their brain activity. The brain activity was fed into a computing system, which created a representation of regularly occurring features in that data. These patterns are likely to be related to repeated features of speech such as vowels, consonants or commands to parts of the mouth. Another part of the system decoded this representation word by word to form sentences. However, the authors freely admit the study's caveats. For example, the speech to be decoded was limited to 30-50 sentences. "Although we should like the decoder to learn and exploit the regularities of the language, it remains to show how many data would be required to expand from our tiny languages to a more general form of English," the researchers wrote in their Nature Neuroscience paper. But they add that the decoder is not simply classifying sentences based on their structure. They know this because its performance was improved by adding sentences to the training set that were not used in the tests themselves. The scientists say this proves that the machine interface is identifying single words, not just sentences. In principle, this means it could be possible to decode sentences never encountered in a training set.
When the computer system was trained on brain activity and speech from one person before training on another volunteer, decoding results improved, suggesting that the technique may be transferable across people.
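The two-stage pipeline described above — compress neural activity into feature vectors, then decode those features word by word — can be illustrated with a minimal sketch. This is not the UCSF team's method (they used recurrent networks trained on real recordings); here synthetic "neural feature" templates and a nearest-template rule stand in for both stages:

```python
import numpy as np

# Hypothetical sketch: stage 1 would compress raw neural activity into
# feature vectors; stage 2 maps each feature vector to a word. A simple
# template-matching decoder stands in for the trained network.

rng = np.random.default_rng(0)
vocab = ["the", "doctor", "asked", "a", "question"]

# Simulated "neural feature" template per word (invented, not real data).
templates = {w: rng.normal(size=16) for w in vocab}

def decode_word(features: np.ndarray) -> str:
    """Return the vocabulary word whose template is closest to the features."""
    return min(vocab, key=lambda w: np.linalg.norm(features - templates[w]))

def decode_sentence(feature_seq) -> list[str]:
    # Decoding word by word, as the article describes.
    return [decode_word(f) for f in feature_seq]

# Noisy observations of a known sentence.
sentence = ["the", "doctor", "asked", "a", "question"]
observed = [templates[w] + 0.1 * rng.normal(size=16) for w in sentence]
print(decode_sentence(observed))
```

Because the decoder operates on individual words rather than whole sentences, it can in principle emit word sequences it never saw during training — the property the researchers highlight.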

7-17-19 Elon Musk's plans for mind-controlled gadgets: what we know so far
Elon Musk’s brain-computer interface company Neuralink has finally broken its silence. Since the company was formed in 2016, it has kept its plans secret, but in a presentation on Tuesday night it showed off its vision and explained what the firm has done so far. At the event, the company unveiled a brain-computer interface – a technology that allows machines to read brain activity. Neuralink says its device will have about 3000 surgically implanted electrodes, each of which will be able to monitor some 1000 neurons at a time. The electrodes will be embedded in around 100 extremely thin threads, between 4 and 6 micrometres wide – much less than the width of a hair. The threads collect the measurements from the electrodes, which will then be gathered through a small incision behind the ear, where a chip will sit to analyse the results. The information will then be sent via Bluetooth to a smartphone app. Neuralink says the interface could be used for everything from helping people with paralysis to control prostheses to allowing people to directly interact with artificial intelligence: “This is going to sound pretty weird, but achieve a sort of symbiosis with artificial intelligence,” said Musk at the event. At the moment, we rely on an interface with technology such as our laptops that is slowed by our fingers or our eyes. Inserting a chip into our brains to speed things up will be key to overcoming that, said Musk. There is still a long way to go. Many research groups are working on brain-computer interfaces and there has been some progress made in recent years.

7-17-19 Elon Musk reveals brain-hacking plans
Neuralink, a company set up by Elon Musk to explore ways to connect the human brain to a computer interface, has applied to US regulators to start trialling its device on humans. The system has been tested on a monkey that was able to control a computer with its brain, according to Mr Musk. The firm said it wanted to focus on patients with severe neurological conditions. But ultimately Mr Musk envisions a future of "superhuman cognition". The device the firm has developed consists of a tiny probe containing more than 3,000 electrodes attached to flexible threads - thinner than a human hair - which can then monitor the activity of 1,000 neurons. The advantage of this system, according to the firm, is that it would be able to target very specific areas of the brain, which would make it surgically safer. It would also be able to analyse recordings using machine learning, which would then work out what type of stimulation to give a patient. Neuralink did not explain how the system translated brain activity or how the device was able to stimulate brain cells. "It's not like suddenly we will have this incredible neural lace and will take over people's brains," Mr Musk said during his presentation. "It will take a long time." But he said, for those who choose it, the system would ultimately allow for "symbiosis with artificial intelligence".

5-15-19 Hearing device picks out right voice from a crowd by reading your mind
Sometimes it is hard to make out what people are saying in a noisy crowded environment. A device that reads your mind to work out which voices to amplify may be able to help. The experimental device can separate two or three different voices of equal loudness in real time. It can then work out which voice someone is trying to listen to from their brainwaves and amplify that voice. The device, created by Nima Mesgarani at Columbia University in New York, is a step towards creating smart hearing aids that solve the classic cocktail party problem — how to separate voices in a crowd. First, Mesgarani’s team worked on a system that could separate the voices of two or three people speaking into a single microphone at the same loudness. Several big companies like Google and Amazon have developed similar AI-based ways of doing this to improve voice assistants like Alexa. But these systems separate voices after people have finished speaking, Mesgarani says. His system works in real time, as people are speaking. Next, the team played recordings of people telling stories to three people who were in hospital with electrodes placed into their brains to monitor epileptic seizures. In 2012, Mesgarani showed that the brainwaves in a certain part of the auditory cortex can reveal which of several voices a person is focusing on. By monitoring the brainwaves of the three volunteers, the hearing device could tell which voice people were listening to and selectively amplify just that voice. When the volunteers were asked to switch attention to a different voice, the device could detect the shift and respond.
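The selection step described above — deciding from brainwaves which of the separated voices to amplify — can be sketched as a correlation test. In published attention-decoding work of this kind, the EEG is used to reconstruct an estimate of the attended speech envelope, which is then matched against each candidate stream; everything below is synthetic stand-in data, not the Columbia system:

```python
import numpy as np

# Sketch: given the separated voice streams and an envelope estimate
# decoded from EEG, amplify the stream whose envelope correlates best.

rng = np.random.default_rng(1)
voices = [rng.normal(size=500) for _ in range(3)]  # separated streams

# Pretend the EEG decoder recovered a noisy copy of voice 1's envelope.
decoded = voices[1] + 0.5 * rng.normal(size=500)

def pick_attended(decoded, streams):
    """Index of the stream most correlated with the decoded envelope."""
    corrs = [np.corrcoef(decoded, s)[0, 1] for s in streams]
    return int(np.argmax(corrs))

def remix(streams, attended, gain=4.0):
    """Boost the attended stream relative to the rest."""
    return sum(gain * s if i == attended else s
               for i, s in enumerate(streams))

idx = pick_attended(decoded, voices)
mix = remix(voices, idx)
print("attended speaker:", idx)
```

When the listener shifts attention, the correlations flip and a fresh call to `pick_attended` selects the new voice — the behaviour the volunteers' switching test demonstrated.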

3-29-19 Can psychics speak to the dead?
A band of skeptics infiltrates audiences to prove that famed psychics are frauds, said journalist Jack Hitt. But sometimes the power of these ‘readings’ is hard to deny. When you’re setting up fake Facebook pages, it’s the little details that can mess things up. On a group computer call last winter, Susan Gerbic was going through her checklist of tips for her team’s latest sting operation—this one focused on infiltrating the audience of a psychic. It all started with maintaining their Facebook sock puppets—those fake online profiles. “American spellings, everyone!” she commanded her half-dozen international colleagues through the Skype crackle. Gerbic lives in Salinas, Calif., and while she is retired from the routine world of work, she has taken on a new job, as self-appointed guardian of Enlightenment Reason. She spends most of her days wrangling her far-flung group of Guerrilla Skeptics into a common cause, defending empirical truth online. This usually consists of editing and monitoring Wikipedia pages—a cat-herding task she says she’s uniquely qualified for. “I was a baby photographer,” she explained. “I ran a JCPenney portrait studio for 34 years.” Collectively, the group, which has swelled to 144 members, has researched, written, or revised almost 900 Wikipedia pages. Sure, they take on the classics, like debunking “spontaneous human combustion,” but many of their other pages have real-world impact. For instance, they straightened out a lot of grim hooey about the phony teen-suicide “blue whale game” challenge, and they have provided facts about the Burzynski Clinic, a Houston operation offering an unproven cancer treatment. Most recently, Gerbic’s members have focused on what they call “grief vampires,” the kind of middlebrow psychics who profit by claiming to summon the dead in venues ranging from Motel 6 conference suites to wine vineyards.
Lately, technology has changed the business of talking to the dead—a $2 billion industry according to one market analysis—and created new kinds of openings for psychics to lure customers, but also new ways for skeptics to flip that technology right back at them. (Webmaster's comment: Shows how they do it. Worth the read.)

2-2-19 Mind-controlled robot lets you weld metal without using your hands
Welding is going hands-free. A mind-controlled robot can weld metal together after receiving mental instructions from its operator. The person controlling the robot wears an electroencephalography (EEG) cap, which measures the brain’s electrical activity via the scalp. They then look at a screen that has several pre-selected metal seams for the robot to weld. Each option on the screen flickers in turn and the operator stares at their choice. When their chosen option flickers, it generates a specific electrical response in the brain detectable by the EEG. By matching the electrical responses to the timing of the options displayed, the robot can identify where the operator was looking and therefore which welding seam they want to proceed with. If the operator is happy the robot understands their choice, they push a button to kick the robotic welder into action. The system was created by Yao Li and Thenkurussi Kesavadas at the University of Illinois Urbana-Champaign. Robot welders are widely used in vehicle manufacturing, for increasing efficiency and reducing labour costs. But they can also pose a safety risk to human factory workers: in 2015, a Volkswagen employee was killed in a German factory when a robot grabbed and crushed him, although accidents like this are rare. The hands-free nature of the robot could help operators perform tasks from further away. The robot also has a built-in camera for detecting if it comes too close to a person’s skin to try to avoid accidents. The next step will be to develop smart robots that can respond intuitively to the factory environment and adjust their movements based on the behavior of operators around them, says Li.
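Staring at a flickering option produces EEG activity at the same rate as the flicker, so the choice can be read out by comparing spectral power at each option's frequency. A minimal sketch of that detection step, with simulated EEG and invented flicker rates (the article does not give the system's actual parameters):

```python
import numpy as np

# Sketch of flicker-based selection: each on-screen weld seam flickers
# at its own rate, and attending to one boosts EEG power at that rate.

fs = 250                            # assumed EEG sample rate (Hz)
t = np.arange(0, 4, 1 / fs)         # 4 seconds of signal
option_freqs = [8.0, 10.0, 12.0]    # one illustrative rate per seam

# Simulate staring at the 10 Hz option: a weak 10 Hz component in noise.
rng = np.random.default_rng(2)
eeg = 0.8 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(size=t.size)

def detect_choice(eeg, freqs, fs):
    """Index of the flicker frequency with the most EEG power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    fft_freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
    powers = [spectrum[np.argmin(np.abs(fft_freqs - f))] for f in freqs]
    return int(np.argmax(powers))

print("selected seam:", detect_choice(eeg, option_freqs, fs))
```

The confirmation button in the real system adds a safety gate on top of this: the robot only acts once the operator verifies the decoded choice.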

1-29-19 Mind-reading implant can decode what your ears are hearing
A device that can recognise what words people are hearing and reproduce them in a robot voice could take us further down the road to reading the minds of people who can’t speak. The technique used temporary electrodes placed in the brain to monitor people before surgery, but the aim is to make a permanent implant. So far the approach has only been able to decode simple words that people were listening to. But the researchers hope that with further development it will understand speech that people are thinking but not voicing. If successful it could help people who can’t speak because they are paralysed after a stroke, for instance, or have motor neuron disease. At the moment such people can generate computer speech by using a headset that places electrodes on the outside of their head. These can detect simple “flashes” of brain activity that let users select letters on a screen in front of them, but communicating this way is slow. “Speech is much faster than we type,” says Nima Mesgarani at Columbia University’s Zuckerman Institute in New York. “We want to let people talk to their families again.” Mesgarani’s team is trying to use electrodes that connect directly with the brain. It’s potentially risky so they exploited the fact that people who need surgery for epilepsy often have electrodes put into or on the surface of their brain temporarily to find out where their seizures are coming from. They asked five people in hospital who had either of these kinds of electrodes in place for a few days to listen to recordings of sentences. Their brain activity was used to train the group’s artificial intelligence speech recognition software.

10-8-18 Three people had their brains wired together so they could play Tetris
Three people played a game of Tetris using brain-reading caps. This is the first time several people have collaborated through brain-to-brain communication. Using only their thoughts to communicate, three people wearing brain-reading caps worked together to play a game of Tetris. This is the first instance of more than two people collaborating through brain-to-brain communication. “Our experiment can be regarded as the first proof-of-concept demonstration that multiple human brains can consciously work together to solve a task that none of the brains individually could,” says Rajesh Rao at the University of Washington in Seattle. Two of the team were tasked with sending information about the state of the game to the third player, who actually decides which moves to make. The two people sending information wear electroencephalography (EEG) caps to record the brain signals produced by their thoughts, and the receiver wears a transcranial magnetic stimulation (TMS) cap that delivers the information to their brain. The senders can see the entire game board, and so have to tell the receiver whether to rotate the block or not. The receiver only sees the current block, not the state of play. The senders communicate with the receiver by focusing their attention on one of two flashing LED lights on either side of a computer monitor, which has the words YES and NO on either side of the screen. Where they focus their attention can be extracted by processing signals from the visual cortex. To pass this information to the receiver, their occipital lobe – the brain’s visual processing centre – is stimulated magnetically when the senders think ‘yes’, causing the receiver to see a flash of light.
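Stripped of the neuroscience, the protocol above is a one-bit channel per sender: attend YES or NO, deliver the bit as a perceived flash (a phosphene), and let the receiver act on the flashes seen. A toy sketch of that message flow, with all detector outputs simulated:

```python
# Sketch of the sender -> TMS -> receiver protocol described above.
# Real decoding happens in EEG/TMS hardware; here each stage is a
# simple function so the information flow is explicit.

def sender_decision(block_needs_rotation: bool) -> str:
    # The sender attends the YES light only when rotation is needed.
    return "YES" if block_needs_rotation else "NO"

def deliver(decision: str) -> bool:
    # TMS stimulation: the receiver perceives a phosphene only for YES.
    return decision == "YES"

def receiver_action(phosphenes: list[bool]) -> str:
    # The receiver rotates if any sender signalled yes.
    return "rotate" if any(phosphenes) else "drop"

flashes = [deliver(sender_decision(True)), deliver(sender_decision(False))]
print(receiver_action(flashes))
```

Having two senders gives the receiver redundancy: even if one sender's signal is decoded wrongly, the other can carry the correct instruction, which is part of what made the three-brain setup more robust than a single pair.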

9-26-18 Mind-reading devices can now access your thoughts and dreams using AI
We can now decode dreams and recreate images of faces people have seen, and everyone from Facebook to Elon Musk wants a piece of this mind reading reality. I FEEL like a cross between an Olympic swimmer and a cyborg. On my head is a bathing-cap-like hat dotted with electrodes, and a cable dangles behind me. David Ibanez and Marta Castellano, from the neuroscience company Starlab, look at me from across a table at their headquarters in Barcelona. As the sun beams in through two giant windows illuminating the plain white room where we sit, I am trying to hide my nerves, but wonder whether that is even possible while wearing a device like this. These may be humble surroundings, but Ibanez and Castellano are about to try to read my mind. For decades, neuroscientists have been trying to decipher what people are thinking from their brain activity. Now, thanks to an explosion in artificial intelligence, we can decipher patterns in brain scans that once just looked like meaningless squiggles. “Nobody dreamed that you could get to the content of thought like we’ve been able to in the past 10 years. It was considered science fiction,” says Marcel Just at Carnegie Mellon University in Pennsylvania. Researchers have already peered into the brain to recreate films people have watched and decoded dreams. Now the world’s biggest players in AI are racing to develop their own mind-reading capabilities. Last year, Facebook announced plans for a device to allow people to type using their thoughts. Microsoft, the US Defense Advanced Research Projects Agency and Tesla’s Elon Musk all have their own projects under way. This is no longer just a case of seeing parts of the brain light up on a screen, it is the first step towards the ultimate superpower. I had to give it a try.

9-19-18 A mind-reading headset lets people fly drones using their thoughts
A group of people learned how to pilot drones with their thoughts, using a headset that converts brain waves into flying instructions. I think, therefore I fly. Headsets that read brain waves are being used to fly drones, letting us control machines with just our thoughts. A team from the Indian Institute of Science in Bangalore trained 14 people to control a multirotor drone using commercially available EEG headsets, devices that use small electrodes to test the electrical activity in your brain. There have been other attempts to control multirotor drones using thought, but Subbaram Omkar, who led the research, believes the new system is accurate enough to control fixed-wing drones – something which hasn’t been done before. Such aircraft require more control because they move through the air continuously, whereas multicopter drones can hover whilst awaiting a command. Omkar says other systems that translate brain activity into drone motion cannot perform quickly enough to control a high-speed vehicle. To pilot the drones, people were asked to imagine four physical movements without moving any actual body parts: moving their left or right hand, and moving their left or right fingers and elbow. This thought process activates the sensory-motor cortex, even if no actual body parts are moved. An algorithm read the pilot’s brain waves at 90 hertz – corresponding to gamma waves, which are thought to be associated with perception – and when a thought pattern was clear enough, used it to steer the drone.
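The "clear enough" gating at the end is worth making concrete: the classifier only issues a steering command when one imagined movement is unambiguously the best match, otherwise the drone holds its course. A hedged sketch with invented feature vectors standing in for the sensorimotor EEG patterns (the team's actual classifier is not described in detail):

```python
import numpy as np

# Sketch: map imagined movements to drone commands via nearest-centroid
# matching, but only act when the match is clearly separable.

rng = np.random.default_rng(3)
commands = ["left", "right", "up", "down"]     # illustrative mapping
centroids = {c: rng.normal(size=8) for c in commands}

def classify(features, margin=1.0):
    """Return a command, or None when the thought pattern is ambiguous."""
    dists = sorted((np.linalg.norm(features - centroids[c]), c)
                   for c in commands)
    best, runner_up = dists[0], dists[1]
    if runner_up[0] - best[0] < margin:
        return None          # not clear enough: keep current heading
    return best[1]

# A confident imagined "right hand" trial lands near its centroid.
trial = centroids["right"] + 0.1 * rng.normal(size=8)
print("command:", classify(trial))
```

The margin threshold is the design choice that matters for fixed-wing flight: a high-speed aircraft is better served by briefly holding its heading than by executing a misclassified turn.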

8-15-18 Why taking ayahuasca is like having a near-death experience
A psychedelic drug produces effects similar to near-death experiences. The finding suggests changes to brain activity may explain such paranormal phenomena. A psychedelic drug taken as part of the South American plant brew ayahuasca produces effects that are strikingly similar to near-death experiences, a study has found. That means that feelings of leaving the body or inner peace associated with life-threatening experiences may simply be explained by changes in how the brain works, and aren’t evidence of paranormal phenomena, say the researchers. DMT is the main psychoactive ingredient in ayahuasca, which is used as a sacrament by some indigenous peoples in the Amazon. People who take it commonly describe feeling that they transcend their body and enter another realm. Chris Timmerman and colleagues at Imperial College London gave DMT intravenously to 13 volunteers and asked them to fill in a questionnaire used to assess near-death experiences. In an earlier session, the same volunteers were given an intravenous placebo, and they did not know which session would involve the real drug. After the DMT session, all 13 participants got results on the questionnaire that met the criteria for a near-death experience. In particular, they reported feeling as though they entered an “unearthly environment”, feeling “incredible peace or pleasantness”, having heightened senses and a feeling of unity with the universe. When their responses were compared with responses from a set of 13 people who had reported actual near-death experiences, there were no statistically significant differences between the groups.

7-25-18 This mind-controlled robotic arm lets you do two things at once
Need a hand? Eight people have reliably used a mind-controlled robotic third arm to perform two tasks simultaneously. Eight people have reliably used a mind-controlled robotic third arm to do two things at once. The technology could be used to give a helping hand when lifting heavy objects or for tasks that require more than two arms. Participants in the experiment had to learn to control a robotic arm using a brain-machine interface. The robotic arm was placed next to the participants, and they wore two electrodes on the outside of their head to capture brain activity. The arm was then calibrated to pick up on the differences in brain patterns when participants imagined the arm grasping and releasing a bottle. To test their skills, participants had to perform two tasks simultaneously. The first was to hold and release a bottle using the robotic arm, and the second was to use their two real arms to move a ball around on a tray. The team found that eight out of the fifteen participants could reliably roll the ball to target points on the tray while grasping and releasing the bottle with the robotic arm, but the other seven struggled and were only successful about half the time. “The two groups didn’t differ by their ability to control the arm, but probably by their ability to concentrate on multiple tasks at once,” says Shuichi Nishio at the Advanced Telecommunications Research Institute in Kyoto, Japan. In other words, he believes people who were better at the task were simply better at multitasking.

7-10-18 My weekend in the desert trying to experience dream telepathy
When Rowan Hooper went to Arizona to explore the purpose of dreams, he found himself among “experts” in using dreams to talk to dead people and diagnose cancer. There’s still so much we don’t know about dreams. What shapes them? What is their true purpose? Wanting to understand such questions, I headed to the Arizona desert for the 35th annual International Dream Conference in Phoenix, last month, only to find myself having lunch with a psychoanalyst scheduled to give a talk on using dreams to predict the future. I was lucky not to choke on my burrito. The International Association for the Study of Dreams, which runs the conference, is “multidisciplinary”: it embraces both scientific and other modes of enquiry into dreaming. I was there to hear presentations from researchers such as Katja Valli, a cognitive neuroscientist at the University of Turku, Finland, who has proposed an evolutionary explanation for dreaming, and Mark Blagrove, a psychologist from the University of Swansea, UK, who is researching how waking events are incorporated into dreams. But I was also interested to learn what people outside the scientific fold are contributing to our understanding of dreams. Call me naive, but it was a shame to find an “us versus them” attitude among the non-scientists. “There are multiple ways of opening doors and sometimes hard science is the wrong key,” Fariba Bogzaran, of the California Institute of Integral Studies, told the meeting. You could argue that parapsychology – the study of telepathy and so on – is harmless fun, a mystical hobby in the vein of astrology. But conference sessions explored the use of dreaming for diagnosing breast and prostate cancer. When pseudoscience veers into the health realm, it risks misleading people with serious diseases, and unnecessarily worrying those who do not have cancer but may have dreamed that they do.

6-20-18 With this new system, robots can ‘read’ your mind
Directing bots with brain waves and muscle twitches could make for a speedier response time. Getting robots to do what we want would be a lot easier if they could read our minds. That sci-fi dream might not be so far off. With a new robot control system, a human can stop a bot from making a mistake and get the machine back on track using brain waves and simple hand gestures. People who oversee robots in factories, homes or hospitals could use this setup, to be presented at the Robotics: Science and Systems conference on June 28, to ensure bots operate safely and efficiently. Electrodes worn on the head and forearm allow a person to control the robot. The head-worn electrodes detect electrical signals called error-related potentials — which people’s brains unconsciously generate when they see someone goof up — and send an alert to the robot. When the robot receives an error signal, it stops what it is doing. The person can then make hand gestures — detected by arm-worn electrodes that monitor electrical muscle signals — to show the bot what it should do instead.

5-23-18 Scientists develop 'mind-reading' algorithm
Researchers are using data from recorded brain activity and software algorithms to generate images reconstructed from a person's memory.

5-2-18 AI can predict your personality just by how your eyes move
Shifty looks or wide pupils, our eyes give away clues to our personality - a discovery that could help robots better understand and interact with humans. The eyes really are a window to the soul. The way they move can reveal your personality type – a finding that could help robots better understand and interact with humans. Psychologists have long believed that personality influences the way we visually take in the world. Curious people tend to look around more and open-minded people gaze longer at abstract images, for example. Now, Tobias Loetscher at the University of South Australia and his colleagues have used machine learning – a type of artificial intelligence – to study the relationship between eye movements and personality more closely. They asked 42 students to wear eye-tracking smart glasses while they walked around campus and visited a shop – an activity designed to mimic everyday life. They also asked them to fill out questionnaires that rated them on the “big five” personality traits: neuroticism, extroversion, agreeableness, conscientiousness, and openness. Their machine learning algorithm found that certain patterns of eye movements were more common among different personalities. For example, neurotic people tended to blink faster, open-minded people had bigger side-to-side eye movements, and conscientious people had greater fluctuations in their pupil size. The reasons why are unclear, but this doesn’t matter if the goal is to teach robots to be more socially aware, says Loetscher. “They need to know which eye movements relate to which personality types, but they don’t need to know why,” he says. Future research may find that brain chemistry explains these patterns, says Olivia Carter at the University of Melbourne. There is already evidence that neurotransmitters like dopamine and noradrenaline influence personality and can affect eye movements like blink frequency and pupil dilation, she says.

5-2-18 This mind-reading hearing aid knows who you’re listening to
An ear-mounted device with a battery of brain-scanning electrodes knows which sounds you're paying attention to – it might also help you get a good night's sleep. People who wear hearing aids can often struggle with the “cocktail party effect” – the inability of the brain to follow a single conversation in a room crowded with voices. Now a device that listens to your brain’s activity can help pinpoint exactly which voice you want to focus on. Most hearing aids use microphones to identify which voices are coming from in front of the wearer, and then amplify these. But conversations don’t just happen face to face. So Florian Denk and his colleagues at the University of Oldenburg in Germany combined a hearing aid with a behind-the-ear device that can sense brainwaves. They were able to show the two could work together to amplify the sounds that a wearer was paying attention to, no matter which direction they were facing. The brainwave-sensing is carried out by a flexible C-shaped EEG device that wraps behind the ear. It uses 10 small electrodes to pinpoint electrical activity in the brain. The device samples both the wearer’s brainwaves and the audio signals in the room and can match the two together, indicating what the person is concentrating on. The device is still just a proof of concept and would have to be much smaller to be useful. At the moment it uses a matchbox-sized amplifier to boost the brain signals, which are then decoded on a desktop computer. But many high-end hearing aids now come with Bluetooth connections that link them to a smartphone, and it is possible that the decoding could be offloaded here, or even to a remote server.

4-16-18 Tune in your head? Mind-reading tech can guess how it sounds
We now have the ability to hear another person’s thoughts. Researchers have identified the brain activity involved in imagining sounds in your head. We now have the ability to hear another person’s thoughts. Researchers have identified the differences in brain activity linked to heard and imagined sounds, a finding that could lead to better communication devices for people who are fully paralysed. In 2014, Brian Pasley at the University of California, Berkeley, and his colleagues eavesdropped on a person’s internal monologue for the first time. They got several people who had electrodes implanted in their brain to read text out loud while having their brain activity recorded. The team used this data to work out which neurons reacted to particular aspects of speech, and created an algorithm to interpret this information. They were then able to analyse the brain activity of people who were imagining speaking, and translate this into digitally synthesised speech. But their algorithm wasn’t perfect – it could only translate brain activity into extremely crude aspects of speech, which were often hard to understand. To get clearer translations, they needed a better understanding of how the brain activity responsible for imagined sound differs from activity associated with actually hearing a real sound. Distinguishing between these two types of brain activity is a challenge, because it’s difficult to know exactly when someone is imagining a specific word and measure the activity associated with this.

4-4-18 Mind-reading headset lets you Google just with your thoughts
A mind-reading device can answer questions in your head. It works by picking up signals sent from your brain when you think about saying something. SILENTLY think of a question and I will answer it. That might sound like a magic trick, but it is the promise of AlterEgo, a headset that lets you speak to a computer without ever uttering a sound. It’s not quite a mind reader, but it is close. The device brings us a step closer to a world where we can interact seamlessly with machines using only our thoughts. AlterEgo’s creators believe that rather than embarrassingly saying things like “OK Google” or “Hey Siri” and then dictating an email or ordering a pizza, eventually we will just think it instead. AlterEgo is far from perfect, but shows what may one day be possible. When you think about speaking, your brain sends signals to the muscles in your face, even if you don’t say anything aloud. The current AlterEgo prototype has a hook that slips over the right ear and sensors placed at seven key areas on the cheeks, jaw and chin. The sensors can eavesdrop on these speech-related signals, before artificial intelligence algorithms decipher their meaning. The device can currently recognise digits 0 to 9 and about a hundred words. AlterEgo is directly linked to a program that can query Google and then speak the answers back via built-in bone conduction headphones, which transmit the sound in a way that nobody else can hear. This means that the wearer can gain access to the world’s biggest information source using only their mind. “It gives you superpowers,” says Arnav Kapur, who created the device with Pattie Maes at the Massachusetts Institute of Technology Media Lab.

3-7-18 We need to be mindful as we develop thought-reading tech
Mass thought control may not be on the cards just yet, but mind-reading tech is developing fast. We need to be prepared. HOWEVER much technology knows about you – and you would be surprised how much it does (see “I exposed how firms and politicians can manipulate us online”) – there is one firewall that it can’t penetrate: your skull. Unless you choose to share your thoughts, they remain private. But for how much longer? Increasingly, a combination of brain scanning and artificial intelligence is opening the black box, gathering signals from deep inside the mind and reverse-engineering them to recreate thoughts. For now, the technology is limited to vision – working out what somebody is looking at from their brain activity (see “Mind-reading AI uses brain scans to guess what you’re looking at”) – but in principle there appears no reason why the entire contents of our minds couldn’t be revealed. This line of research inevitably raises fears about the ultimate invasion of privacy: mind reading. It is not difficult to imagine some sort of device that can simply be pointed at somebody’s head to extract their thoughts. Not difficult to imagine, but extremely difficult to realise. At present, the scanning part is done by functional magnetic resonance imaging (fMRI). This requires an extremely large and expensive piece of equipment and, crucially, the total consent of the brain being scanned. Subjects have to enter a narrow tunnel on a gurney with their head held perfectly still, and submit to a lengthy examination. On top of that, fMRI is still a fairly crude device for mind reading. It has been criticised for producing colourful cartoons of brain activity but not much actual data.

3-5-18 AI reconstructs whatever you see just by reading a brain scan
An algorithm can reconstruct pictures a person is looking at from brain scans, and could one day be used to tell what someone is thinking. AI can pluck images directly from a person’s brain. Given an fMRI scan of someone looking at a picture, an algorithm can reconstruct the original picture from the scan. Though the results aren’t yet perfect, they are often recognisable and hint at what may be possible in the future. Most neural networks of this sort have two steps: first they decode the data from your brain scan into a few specific features that the algorithm can understand, then they either reconstruct or identify the picture that those features represent. In order to do this, the network is trained on a pre-assembled set of images, sometimes using over a million pictures. Guohua Shen at Japan’s Advanced Telecommunications Research Institute and his colleagues experimented with three types of images: “natural” pictures of things like bats, snowmobiles and stained glass; artificial shapes like squares and plus signs; and alphabetical letters. The shapes and letters were identifiable, but the reconstructions of the natural images tended to be blurry and difficult to parse.
Once improved, AIs like this might allow computers to know what we’re thinking about. “These decoding methods could be used for human-computer interaction in the future,” says Haiguang Wen at Purdue University in Indiana. “It is exciting that you could be able to know what a person is dreaming or thinking just by analyzing the brain signals.” (Webmaster's comment: Reading minds is not that far away.)
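The two-step decode-then-identify structure described above can be sketched with toy numbers. Nothing here reflects the actual models in the paper; the weight matrix, feature vectors and candidate set are all invented:

```python
import math

# Toy sketch of two-stage decoding: (1) project a brain scan into a small
# feature space, (2) identify which candidate image those features best
# match. The "decoder" is a fixed, made-up weight matrix.

WEIGHTS = [  # maps 4 simulated voxel readings to 2 image features
    [0.8, 0.1, -0.2, 0.3],
    [-0.1, 0.9, 0.4, -0.3],
]

CANDIDATES = {  # known images and their (hypothetical) feature vectors
    "bat":        [1.0, 0.1],
    "snowmobile": [0.2, 1.0],
    "plus sign":  [0.6, 0.6],
}

def decode(voxels):
    """Stage 1: project voxel activity into image-feature space."""
    return [sum(w * v for w, v in zip(row, voxels)) for row in WEIGHTS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def identify(voxels):
    """Stage 2: return the candidate image whose features fit best."""
    feats = decode(voxels)
    return max(CANDIDATES, key=lambda name: cosine(feats, CANDIDATES[name]))
```

Reconstruction, as opposed to identification, replaces stage 2 with a generative model that draws a picture from the decoded features, which is why natural images come out blurry while simple shapes survive.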

3-1-18 Mind-reading AI uses brain scans to guess what you’re looking at
An artificial intelligence can write a caption for a picture a person is looking at without seeing the original image, but looking at a brain scan instead. Can you guess what I’m looking at? Artificial intelligence can. A new system developed in Japan can describe a picture someone is viewing, using brain scans alone. Algorithms have recently become pretty good at generating image captions, however they normally get to see the images they are captioning. Now it seems the same techniques can be used to generate captions via scans of a person’s brain. “I consider it a form of mind reading, or perhaps at this point just mind skimming,” says Umut Güçlü at Radboud University in the Netherlands, who was not involved in the research. To generate a caption, the AI is given an image of a person’s brain, taken with an fMRI scanner while the person was looking at an image. The fMRI scanner shows the surges in blood flow that correspond with activity in the brain, so the different parts of the brain that process the image light up on the scan. From this, the AI then produces a caption based on what it thinks the person was viewing. For example, one caption it generated was “A dog is sitting on the floor in front of an open door”, which correctly described the scene. Working out the accuracy of the system is tricky because there is no definitive best caption. However, of the six captions included in the paper, most of them fairly accurately described the original image and made grammatical sense. “Some results were quite good, others were bad,” says Ichiro Kobayashi at Ochanomizu University in Japan.

2-28-18 DeepMind AI is learning to understand the ‘thoughts’ of others
The firm’s new artificial intelligence has developed a theory of mind, passing an important psychological assessment that most children only develop around age 4. MACHINES are getting to know each other better. An artificial intelligence, developed by Google-owned research firm DeepMind, can now pass an important psychological assessment that most children only develop the skills to pass at around age 4. Its aptitude in this key theory of mind test may lead to AIs that are more human-like. Most humans regularly think about other people’s desires, beliefs or intentions. For a long time, this was thought to be uniquely human, but an increasing body of somewhat controversial evidence suggests that some other animals, such as chimps, bonobos, orangutans and ravens may have theory of mind (see “What do you think?“). However, the idea that machines could share these abilities is normally reserved for sci-fi. DeepMind thinks otherwise. The firm created its latest AI with the intention of it developing a basic theory of mind. The AI is called Theory of Mind-net, or ToM-net for short. In a virtual world, ToM-net is able to not just predict what other AI agents will do, but also understand that they may hold false beliefs about the world. For humans, the idea that others can hold false beliefs seems very natural, especially if you follow politics closely, or read the comment section on news websites. However, humans don’t actually understand that other people can hold false beliefs until around age 4. “It’s a classic developmental stage for young children,” says Peter Stone at the University of Texas at Austin.

2-2-18 How to control a machine using your mind
Imagine being able to make a machine do your bidding with your thoughts alone, no button pressing, typing, screen tapping or fumbling with remote controls, just brain power. Well, this sci-fi scenario could be closer to reality than you think. Bill Kochevar's life was changed, seemingly irrevocably, when he was paralysed from the shoulders down following a cycling accident nearly a decade ago. His future looked bleak. But last year he was fitted with a brain-computer interface, or BCI, that enabled him to move his arm and hand for the first time in eight years. Sensors were implanted in his brain, then over a four-month period Mr Kochevar trained the system by thinking about specific movements, such as turning his wrist or gripping something. The sensors effectively learned which bits of the brain fired up - and in what sequence - for each movement. Then, when 36 muscle-stimulating electrodes were implanted into his arm and hand, he was able to control its movements simply by thinking about what he wanted to do. Within weeks, he could feed himself again. "This research has enhanced my ability to be able to do things," he told the BBC last year. Research teams are also working on mind-controlled wheelchairs, and on using sensors to allow people who are completely paralysed to give yes-and-no answers through the power of thought. But this technology doesn't just have health-related applications. Many tech companies are exploring brain control as a user interface. Recently, for example, car maker Nissan unveiled a "brain-to-vehicle" headset that monitors a driver's brainwaves to work out what they're about to do - before they do it. The aim of the system is to allow the vehicle to respond that split-second more quickly than a driver's natural reaction time.

10-30-17 Chill factors: The everyday things that make us see ghosts
Seeing ghosts is all too human, but what spooks us and why are some more susceptible? Surveys of "haunted" sites and gameplay are unmasking clues. I’M WANDERING the corridors of a derelict hospital. The place was abandoned following the mysterious disappearances of a woman in a coma, then several other patients. Strange noises have been reported coming from inside. Nobody knows what’s going on. It’s pretty spooky in here – dimly lit, with peeling paint and rusty doors. I screw up the courage to open one of them and – BAM! – a bloodied zombie girl leaps out at me. My heart starts racing. That zombie gets everyone, says Connor Lloyd at Buckinghamshire New University, UK. He should know. He designed the game and has all his players wired up so he can monitor their heart rate, breathing and sweating to find out what scares players. “I’m interested in how games affect people’s minds,” he says. But when Ciarán O’Keeffe, head of psychology at the university, came across the game, he realised it could do much more. O’Keeffe is now adapting it to study ghosts. Rationalists may scoff, but it’s only human to feel haunted. Many more people believe in ghosts and claim to have encountered one than you might suppose (see “Anyone for ghosts?”). “I think it’s quite arrogant of us to ignore these experiences and to say they’re all deluded,” says O’Keeffe, who is one of only a handful of researchers studying ghost sightings and supposedly haunted locations. Of course, he doesn’t believe ghosts are real. What he wants to know is why we get spooked. Over the years, researchers have singled out various physical, psychological and environmental factors. But debate continues about which ones are actually involved, how they create ghostly experiences and why some of us are more affected than others. An immersive game could be the best way to find answers.

6-23-17 Amputees control avatar by imagining moving their missing limbs
Even after losing a limb, brain activity associated with imagined movements can be read by an fMRI brain scanner and used to control a computer character. Neuron activity associated with imagined movements could be used to control prosthetics. People who have had amputations can control a virtual avatar using their imagination alone, thanks to a system that uses a brain scanner. Brain-computer interfaces, which translate neuron activity into computer signals, have been advancing rapidly, raising hopes that such technology can help people overcome disabilities such as paralysis or lost limbs. But it has been unclear how well this might work for people who had limbs removed some time ago, as the brain areas that previously controlled them may become less active or be repurposed for other uses over time. Ori Cohen at IDC Herzliya, in Israel, and colleagues have developed a system that uses an fMRI brain scanner to read the brain signals associated with imagining a movement. To see whether it could work a while after someone had a limb removed, they recruited three volunteers who had had an arm removed between 18 months and two years earlier, and four people who have not had an amputation. While lying in the fMRI scanner, the volunteers were shown an avatar on a screen with a path ahead of it, and instructed to move the avatar along this path by imagining moving their feet to move forward, or their hands to turn left or right. The people who had had arm amputations were able to do this just as well with their missing hand as with their intact hand. Their overall performance on the task was almost as good as that of the people who had not had an amputation.

5-19-17 A classic quantum test could reveal the limits of the human mind
Using human consciousness as the trigger in a test of ‘spooky action at a distance’ could tell us whether mind is made of different stuff than matter. The boundary between mind and matter could be tested using a new twist on a well-known experiment in quantum physics. Over the past two decades, a type of experiment known as a Bell test has confirmed the weirdness of quantum mechanics – specifically the “spooky action at a distance” that so bothered Einstein. Now, a theorist proposes a Bell test experiment using something unprecedented: human consciousness. If such an experiment showed deviations from quantum mechanics, it could provide the first hints that our minds are potentially immaterial. Spooky action at a distance was Einstein’s phrase for a quantum effect called entanglement. If two particles are entangled, then measuring the state of one particle seems to instantly influence the state of the other, even if they are light years apart. But any signal passing between them would have to travel faster than the speed of light, breaking the cosmic speed limit. To Einstein, this implied that quantum theory was incomplete, and that there was a deeper theory that could explain the particles’ behaviour without resorting to weird instantaneous influence. Some physicists have been trying to find this deeper theory ever since.

3-6-17 Humans control robots with their minds by watching for mistakes
An EEG-based system uses the brain signals generated when we spot an error to correct an industrial robot’s movements as it works. Try again, robot, you’re doing it wrong. A brain-computer interface lets people correct robots’ mistakes using the power of their thoughts. The system uses electroencephalography (EEG) to measure a person’s brain signals as they watch a robot work. When it detects a signal suggesting the person has witnessed a mistake, it alters the robot’s course. The system could be used to let humans control industrial robots simply by observing them. “We’re taking baby steps towards having machines learn about us, and having them adjust to what we think,” says Daniela Rus at the Massachusetts Institute of Technology. Rus and her team used an EEG headset to measure how the electrical signals in five volunteers’ brains responded as they watched a robot reach towards one of two LED lights. In each test, one LED was randomly selected as the “correct” one. If the volunteer saw that the robot was reaching for the wrong one, the headset detected this in their EEG readings and sent a signal to the robot, making it reach for the other. The robot used was Baxter, an industrial robot made by Rethink Robotics in Boston, Massachusetts. When we witness a mistake, we generate brain signals called “error potentials”, says Ricardo Chavarriaga at the Swiss Federal Institute of Technology in Lausanne. Error potentials have a distinctive shape, which makes them a good choice for controlling a robot, he says. In 70 per cent of cases where the volunteers noticed that the robot was making a mistake, the system correctly recognised an error potential and altered the robot’s actions. The result was similar on a task where volunteers watched Baxter sort reels of wire and paint bottles into different boxes.
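The correction loop itself is conceptually simple and can be sketched as follows. The real error-potential detector is a classifier trained on EEG waveforms; the threshold test below is a hypothetical stand-in, with invented signal values:

```python
# Toy sketch of the closed loop described above: the robot reaches for one
# of two LEDs; if the observer's EEG shows an error potential (simulated
# here as an amplitude crossing a threshold), the robot switches targets.

ERRP_THRESHOLD = 5.0  # hypothetical amplitude marking an error potential

def detect_error_potential(eeg_window):
    """Crude detector: any sample over the threshold counts as an ErrP."""
    return max(eeg_window) > ERRP_THRESHOLD

def corrected_target(initial_target, eeg_window):
    """Keep the robot's choice, unless the observer's brain flags a mistake."""
    if detect_error_potential(eeg_window):
        return "left" if initial_target == "right" else "right"
    return initial_target
```

Because the human does nothing deliberate, the brain produces the error potential automatically, this scheme needs no training of the observer, only of the detector.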

3-2-17 Empathy device lets a friend’s brain signals move your hand
One person’s brain activity triggers hand gestures in another person in a muscle stimulation system aimed at communicating mood and encouraging empathy. If you’re happy and you know it, clap someone else’s hands. A muscle stimulation system aims to evoke empathy by triggering involuntary hand gestures in one person in response to mood changes in another. “If you’re moving in the same way as another person you might understand that person better,” says Max Pfeiffer at the University of Hannover in Germany. Pfeiffer and his team wired up four people to an EEG machine that measured changes in the electrical activity in their brain as they watched film clips intended to provoke three emotional responses: amusement, anger and sadness. These people were the “emotion senders”. Each sender was paired with an “emotion recipient” who wore electrodes on their arms that stimulated their muscles and caused their arms and hands to move when the mood of their partner changed. The gestures they made were based on American Sign Language for amusement, anger and sadness. To express amusement, volunteers had their muscles stimulated to raise one arm, to express anger they raised an arm and made a claw gesture, and to express sadness they slowly slid an arm down their chest. These resemble natural movements associated with the feelings, so the team hypothesised that they would evoke the relevant emotion. Asked to rate how well the gestures corresponded to the emotions, the volunteers largely matched the gestures to the correct mood.

12-14-16 We will soon be able to read minds and share our thoughts
EEG caps that monitor brain activity are allowing us to send thoughts to each other directly – a technology that could help people who are paralysed regain movement. The first true brain-to-brain communication in people could start next year, thanks to huge recent advances. Early attempts won’t quite resemble telepathy as we often imagine it. Our brains work in unique ways, and the way each of us thinks about a concept is influenced by our experiences and memories. This results in different patterns of brain activity, but if neuroscientists can learn one individual’s patterns, they may be able to trigger certain thoughts in that person’s brain. In theory, they could then use someone else’s brain activity to trigger these thoughts. So far, researchers have managed to get two people, sitting in different rooms, to play a game of 20 questions on a computer. The participants transmitted “yes” or “no” answers, thanks to EEG caps that monitored brain activity, with a technique called transcranial magnetic stimulation triggering an electrical current in the other person’s brain. By pushing this further, it may be possible to detect certain thought processes, and use them to influence those of another person, including the decisions they make. Another approach is for the brain activity of several individuals to be brought together on a single electronic device. This has been done in animals already. Three monkeys with brain implants have learned to think together, cooperating to control and move a robotic arm.
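The 20-questions experiment reduces to transmitting one bit per question. The sketch below replaces the EEG-plus-magnetic-stimulation channel with a plain function, and shows how yes/no answers alone pin down one object among eight; the object list is invented:

```python
# Toy sketch of the yes/no brain-to-brain channel: the "sender" answers
# binary questions, and each answer (one bit) halves the set of possible
# objects, as in 20 questions. transmit_answer stands in for the real
# EEG-detection and brain-stimulation hardware.

OBJECTS = ["shark", "eagle", "snake", "horse", "apple", "rose", "oak", "cactus"]

def transmit_answer(secret, candidates):
    """Sender's one-bit answer: is the secret in the first half?"""
    return secret in candidates[: len(candidates) // 2]

def play(secret):
    """Receiver narrows the candidate list one transmitted bit at a time."""
    candidates = OBJECTS[:]
    bits_used = 0
    while len(candidates) > 1:
        mid = len(candidates) // 2
        if transmit_answer(secret, candidates):
            candidates = candidates[:mid]
        else:
            candidates = candidates[mid:]
        bits_used += 1
    return candidates[0], bits_used
```

Three bits suffice for eight objects; twenty yes/no answers could in principle distinguish over a million, which is why even a one-bit channel is a meaningful start.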

11-12-16 ‘I’m more confident’: Paralysed woman’s life after brain implant
HB, who has ALS, is the first person to use a brain implant at home. Using electrodes placed under her skull, she is able to play games and communicate. HB, who is paralysed by amyotrophic lateral sclerosis (ALS), has become the first woman to use a brain implant at home and in her daily life. She told New Scientist about her experiences using an eye-tracking device that takes about a minute to spell a word.

11-12-16 First home brain implant lets ‘locked-in’ woman communicate
After training on whack-a-mole and Pong, a woman paralysed by ALS has become the first person to use a brain implant at home, communicating by thought alone. A paralysed woman has learned to use a brain implant to communicate by thought alone. It is the first time a brain–computer interface has been used at home in a person’s day-to-day life, without the need for doctors and engineers to recalibrate the device. “It’s special to be the first,” says HB, who is 58 years old and wishes to remain anonymous. She was diagnosed with amyotrophic lateral sclerosis (ALS) in 2008. The disease ravages nerve cells, leaving people unable to control their bodies. Within a couple of years of diagnosis, HB had lost the ability to breathe and required a ventilator. “She is almost completely locked in,” says Nick Ramsey at the Brain Center of University Medical Center Utrecht in the Netherlands. When Ramsey met her, the woman relied on an eye-tracking device to communicate. The device allows her to choose letters on a screen to spell out words, but may not work forever – one in three people with ALS lose the ability to move their eyes. However, teams around the world have been working to develop devices that are controlled directly by the brain to help people like HB.

10-26-16 Paralysed people inhabit distant robot bodies with thought alone
Using a head-up display and a cap that reads brain activity, for the first time three people with spinal injury have controlled a robot and seen what it sees. IN THE 2009 Bruce Willis movie Surrogates, people live their lives by embodying themselves as robots. They meet people, go to work, even fall in love, all without leaving the comfort of their own home. Now, for the first time, three people with severe spinal injuries have taken the first steps towards that vision by controlling a robot thousands of kilometres away, using thought alone. The idea is that people with spinal injuries will be able to use robot bodies to interact with the world. It is part of the European Union-backed VERE project, which aims to dissolve the boundary between the human body and a surrogate, giving people the illusion that their surrogate is in fact their own body. In 2012, an international team went some way to achieving this by taking fMRI scans of the brains of volunteers while they thought about moving their hands or legs. The scanner measured changes in blood flow to the brain area responsible for such thoughts. An algorithm then passed these on as instructions to a robot. The volunteers could see what the robot was looking at via a head-mounted display. When they thought about moving their left or right hand, the robot moved 30 degrees to the left or right. Imagining moving their legs made the robot walk forward.

7-14-16 Who you gonna call? The real-life ghost hunters
When staff at Italian restaurant Nido's were convinced they had a ghost running amok, they knew exactly who to call. They phoned a local, real-life team of ghost hunters. A few days later, a five-person crew from Dead of Night Paranormal Investigations arrived at the eatery in the US city of Frederick, Maryland, to investigate. After speaking to waiters and chefs, who said that an invisible presence would stomp up and down the stairs, the team set to work. One online directory, which claims to be the world's largest listing of paranormal societies, shows more than 3,600 such groups in the US. The same site lists 53 organisations from Canada, and 57 from the UK. To become a paranormal investigator requires no formal qualifications. Nor are there any licensing requirements, and you don't even have to believe in ghosts. By not charging, investigation groups such as Dead of Night and East Coast Research say they can be as scrupulous and scientific as possible. Polls show that about a third of people in the US and the UK believe in ghosts. (Webmaster's comment: There are no Gods, Ghosts, Goblins, or Ghouls, but many humans do seem to want to believe in something from a non-existent supernatural world.)

6-1-16 Mind Melds And Brain Beams: The Dawn Of Brain-To-Brain Communication
Music students download the technique of their favorite pianist or singer directly into their brains. Medical students download the skills of a seasoned surgeon or diagnostician. And each one of us routinely uploads our thoughts and memories to the digital cloud. While these scenarios still lie in the future, rudimentary versions of the necessary brain-to-brain technology exist today. But the ability to directly influence another person’s brain raises serious questions about human rights and individual freedoms. This program will present the latest technology and explore how the ethical implications of enhanced thinking go to the heart of consciousness itself.

4-27-16 Map of the brain’s word filing system could help us read minds
Brain scans show how words linked to specific concepts are stored in themed areas, giving us a way to peek at people's thoughts. Now Jack Gallant at the University of California, Berkeley, and his team have charted the “semantic system” of the human brain. The resulting map reveals that we organise words according to their deeper meaning, in subcategories based around numbers, places, and other common themes. Previous “mind-reading” studies have shown that certain parts of the brain respond to particular words. Gallant’s own lab had already found that the brain sorts visual information by meaningful categories like animals or buildings. In their latest experiment, the team wanted to see if they could build a more complete map of meaning across the cerebral cortex, the folded outer layer of grey matter.

4-15-16 We are zombies rewriting our mental history to feel in control
Ever thought you have an uncanny knack of predicting events? It's probably down to shortcomings in the human brain. Bad news for believers in clairvoyance. Our brains appear to rewrite history so that the choices we make after an event seem to precede it. In other words, we add loops to our mental timeline that let us feel we can predict things that in reality have already happened. Adam Bear and Paul Bloom at Yale University conducted some simple tests on volunteers. In one experiment, subjects looked at five white circles and silently guessed which one would turn red. Once one circle had changed colour, they reported whether or not they had predicted correctly. Over many trials, their reported accuracy was significantly better than the 20 per cent expected by chance, indicating that the volunteers either had psychic abilities or had unwittingly played a mental trick on themselves. (Webmaster's comment: Relying on verbal reports is not an objective measure. It is a bad test for anything, except perhaps for revealing how many participants unconsciously or deliberately skew the results.)
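The "better than 20 per cent" comparison is a standard binomial question. The sketch below computes the exact chance probability of reporting at least k hits in n trials; the counts in the example are hypothetical, not Bear and Bloom's actual data:

```python
from math import comb

# Sanity-check sketch for the chance-level comparison above: with five
# circles, a pure guesser is right 20% of the time. This computes the
# exact probability of at least k correct guesses out of n trials under
# chance alone.

def binomial_tail(k, n, p=0.2):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# e.g. 30 reported hits in 100 trials, against a 20% chance baseline:
p_value = binomial_tail(30, 100)
```

A small p-value only shows the *reports* beat chance; as the comment above notes, it cannot distinguish genuine prediction from retrospectively revised answers, which is exactly the confound the study exploits.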

4-13-16 Brain implant lets paralysed man move his hand with his thoughts
Ian Burkhart is the first paralysed person to regain control of his own hand and fingers using a mind-reading device implanted in his brain. Ian Burkhart was 19 years old when he broke his neck diving into shallow water on holiday. Since then, he has been unable to move either of his legs, or his arms below the elbow (read Ian’s story here). Now, in a world first, he has regained control of one hand and his fingers using a mind-reading device. In the past few years, we have seen paralysed people walk again with the aid of exoskeletons, and by using recorded brain activity to trigger electric stimulations to the leg muscles. Others have trained paralysed people to control computer cursors and robotic limbs by thought alone. (Webmaster's comment: Mind-reading is here. Is it ESP or PSI? It's obviously close.)

3-2-16 I’m creating telepathy technology to get brains talking
Brain-to-brain communication is becoming a reality, says Andrea Stocco, who sees a future where minds meet to share ideas, or even to aid recovery from a stroke.
What have you achieved so far?
Most recently we have had two people in two different buildings play a version of 20 questions with each other using a brain-to-brain communication device. The volunteers did surprisingly well, guessing the right object 72 per cent of the time.
Some people have called this “mind-reading”. Would you agree with this description?
Not quite. In earlier work, my team developed a device that could tell when a person was thinking of moving their hand before they made any movement at all. This ability to detect an intention is more akin to mind-reading. But in our 20-questions set-up, we are only transmitting simple visual information.
(Webmaster's comment: Fascinating progress is being made, but you'll have to subscribe to the online version to read the article.)

2-9-16 Mind-reading tech helps beginners quickly learn to play Bach
A system involving brain sensors allowed a group of beginners to quickly learn to play a piano piece by Bach, and the tech could speed up other kinds of learning. Called BACh – for Brain Automated Chorales – the system helps beginners learn to play Bach chorales on piano by measuring how hard their brains are working. It only offers a new line of music to learn when the brain isn’t working too hard, avoiding information overload. Developed by Beste Yuksel and Robert Jacob of Tufts University in Massachusetts, BACh estimates the brain’s workload using functional Near-Infrared Spectroscopy (fNIRS), a technique that measures oxygen levels in the brain – in this case in the prefrontal cortex. A brain that’s working hard pulls in more oxygen. Sensors strapped to the player’s forehead talk to a computer, which delivers the new music. (Webmaster's comment: Our minds can be "read" by sensors. Why is some kind of "reading" of a human mind by another human mind so outlandish?)
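BACh's gating logic can be sketched as a simple threshold rule. The workload numbers and cutoff below are invented placeholders for real fNIRS oxygenation estimates:

```python
# Toy sketch of BACh-style gating: release the next line of the piece only
# when the learner's estimated mental workload (a made-up number standing
# in for an fNIRS prefrontal-oxygenation reading) drops below a threshold.

WORKLOAD_LIMIT = 0.6  # hypothetical "brain not working too hard" cutoff

def lines_delivered(workload_readings, total_lines):
    """Count how many lines of music get released over a practice session."""
    delivered = 1  # the learner starts with the first line
    for reading in workload_readings:
        if reading < WORKLOAD_LIMIT and delivered < total_lines:
            delivered += 1
    return delivered
```

The design choice is the interesting part: rather than pacing by time or by performance errors, the system paces by a physiological proxy for cognitive load, so material arrives only when there is spare capacity to absorb it.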

12-11-15 The Mind Controlled UFO (Drone).
This is the orb that uses your focused brain waves to remotely control its flight. An included headband and earlobe clip measure electrical activity produced by your brain (similar to EEG monitoring technology used by medical professionals). A downloaded app converts an iPhone or Android device into a remote control that pairs with the headband via Bluetooth. As you relax and concentrate, an included infrared transmitter connected to your smartphone’s audio port sends a wireless signal to the UFO. The app provides a control panel that allows you to adjust the throttle, yaw, and pitch thresholds of the UFO’s propellers, adjust the sensitivity of concentration, and filter background electromagnetic interference. The included USB cable charges both the infrared transmitter and the UFO from a computer connection. (Webmaster's comment: If we can send our thoughts through our skull to a headband and control a drone, it is certainly conceivable that two people who place their heads close together could, with practice, communicate simple images, thoughts and feelings. And that would be PSI or ESP, wouldn't it? In fact, isn't communicating thoughts to a headband to control a drone PSI?)

11-17-15 Frontal brain wrinkle linked to hallucinations
A study of 153 brain scans has linked a particular furrow, near the front of each hemisphere, to hallucinations in schizophrenia. This fold tends to be shorter in those patients who hallucinate, compared with those who do not. It is an area of the brain that appears to have a role in distinguishing real perceptions from imagined ones.

Let's get real. There is no supernatural anything. Nothing is outside of reality. There is no single God, and no Gods, no Ghosts, no Goblins and no Ghouls. They are all fabrications of the human mind in an effort to make sense of what we experience but don't understand. Our minds create an imagined reality or experience and we accept our imagining as something real. We seek a reason for existence and we just can't seem to accept that IT JUST IS.

Nonetheless, there is strong evidence for anomalous psychic experiences such as extrasensory perception. That doesn't mean these experiences are outside of reality, or that they are somehow supernatural. It just means we don't understand these experiences and cannot explain them YET. That's why we use science to study them: first, to identify explicitly what it is that people are experiencing; and second, to perform further experiments to understand how these experiences physically work.

Unfortunately, many skeptical scientists see the study of extrasensory perception as a threat to science. They have already decided these experiences cannot be real. To protect science from the "charlatan" scientists performing these experiments, they created a committee to set up rules and tests that parapsychology research results must pass before they can be accepted as valid science. So the parapsychologists went back, designed tests that met the very strict rules required, and performed the tests again. Many of the tests still came out positive for extrasensory perception. OOPS.

So the skeptical scientists went back to their drawing board and made the tests virtually impossible to pass for just about any research. One of the skeptical scientists actually quit the committee after realizing that this was not about making objective tests for parapsychology research to pass. It was about making tests that would not allow parapsychology research to pass PERIOD.

And that's where the parapsychology field stands today. Parapsychology research is at a standstill for lack of funds, and career destruction is almost assured for anyone who dares to study extrasensory perception. Science has lost a lot of credibility because of this issue. In this field of study, science has not lived up to its own objectivity standards.

So look at the evidence presented here and, as Professor Mark Leary says, "decide for yourself whether psychic abilities are myth or reality." That will have to do for now.

(Webmaster's comment: Ignorant anti-science twits have attacked me for even suggesting that some form of "extrasensory" perception is possible. Never mind that technology already "reads" minds to control physical devices such as wheelchairs, computers and toys. These devices work by reading electromagnetic brain waves that are transmitted through our skulls and outside of our heads. To say that it's impossible for some other human brain to detect these electromagnetic waves is pure ignorant nonsense. And if they can be detected, then it may be possible to interpret them, and voila, ESP. Of course it is not "extrasensory." It is sensory and in the real world. But the effect certainly could still be real. Time and science will tell. Read the last book on the list, "Extra Sensory," and see if you can get a copy of Mark Leary's lecture below. He's way more intelligent than those ignorant fools who attack science because it does not agree with their limited understanding of science and what might be possible. Hint: many of them are Atheists. Their fear of anything that might be construed as "extrasensory" blinds them to studying reality.)
