Raspberry Pi meets Cognitive Neuroscience

Can the Raspberry Pi be used to process 4D neuroimaging data?

When I started my master’s degree, I didn’t entirely know what I was taking on. I chose to study cognitive neuroscience because I knew this was an area which presently receives a lot of funding for PhD research, and also because it seemed like a robust, scientific approach to psychology. After a few short months I have come to discover that research in this area is something I really enjoy, and that is largely down to the opportunities to program and develop computerized research tools.

At the same time, the Raspberry Pi Foundation has announced and launched the new Raspberry Pi 2: a credit-card-sized motherboard which can be run as a standalone personal computer. I’m not going to go into all the ins and outs of the Raspberry Pi (or Pi, for short); you can find more about it by visiting their website at www.raspberrypi.org. Instead, I want to talk about what it brings to the study of psychology.

Anybody who has studied psychology as a science will know that a lot of research is made up of undergraduate students sitting in dark rooms performing mundane tasks while their reaction times are measured. This has been terrific for the creators of MATLAB, whose IDE has facilitated many a psychology-button-pressing extravaganza. Even so, there is a new kid on the block who is gradually gathering momentum, and that is Python.

One of the modules we undertook this spring semester was programming in Python, something I hadn’t done before. I was keen to do at least some of this on the Pi, so as to justify my impulsive buying of it. My lecturer found it kind of cute (that would be “cute”, except that he didn’t say it aloud). I persisted nonetheless, until I found that, at least for graphics, the Pi wasn’t altogether compatible with the PsychoPy module we were using. Unfortunately it does not support OpenGL graphics, so it actually struggles with button-pressing experiments. But it was a start, it whetted my appetite, and it was fun.

The real fun has commenced as I’ve begun my dissertation project. We’re doing some work on spatial cognition (how space is represented in the brain) using functional magnetic resonance imaging (fMRI), and part of this involves calculating different test statistics on 3D data on a voxel-by-voxel basis (voxel = volumetric pixel). There are lots of Python modules optimized for flattening, analyzing and reassembling these datasets, and we’re using them to build novel analyses which haven’t been done before. On this frontier of neuroimaging research, the Raspberry Pi stands gallant as my building platform and testing station for producing these scripts.
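That flatten-analyze-reassemble pattern is easy to sketch with NumPy. The shapes below match our scans, but the data is random and the per-voxel statistic is a placeholder, not the actual analysis:

```python
import numpy as np

# Hypothetical 4D dataset: 64x64x26 voxels, 10 time points each
data = np.random.rand(64, 64, 26, 10)

# Flatten the three spatial dimensions into one axis of voxels
n_voxels = 64 * 64 * 26
flat = data.reshape(n_voxels, 10)

# Compute a per-voxel statistic (a simple mean over time stands in
# for whatever test statistic the analysis actually needs)
stat = flat.mean(axis=1)

# Reassemble the results back into the original 3D volume
result = stat.reshape(64, 64, 26)
print(result.shape)  # (64, 64, 26)
```

Because NumPy reshapes are just views over the same memory, the flatten and reassemble steps are essentially free even on the Pi; it is the per-voxel statistic that costs time.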

I’ve been using the module ‘minepy’ to calculate the Maximal Information Coefficient (MIC) for each voxel in a set of datasets. The modules install seamlessly on the Pi from the different repositories, and scripts can be written elegantly through the ‘spyder’ IDE. I’m excited most of all about the sheer size of the data we are working with. Each dataset is 64x64x26 voxels, which means 106496 calculations. On the Pi, each one takes about a second, and (when I close the GUI) all in all it takes about 2 hours. For me to run this on all 48 of our scans would take about 4 days. Fortunately, for processing the whole lot we can pass our script to the cluster computer at York, which (assuming it is coded correctly) should polish it all up in around 16 hours.
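The voxel-by-voxel loop itself might look like the sketch below. The real score comes from minepy’s MINE class (construct a MINE object, call compute_score(x, y), then read mic()), but a squared Pearson correlation stands in here so the sketch runs without minepy installed, and the data is random rather than real scans:

```python
import numpy as np

def mic_like(x, y):
    # Stand-in for minepy: m = MINE(); m.compute_score(x, y); m.mic().
    # A squared Pearson correlation keeps the sketch self-contained;
    # like MIC, it lands in the range [0, 1].
    r = np.corrcoef(x, y)[0, 1]
    return r * r

rng = np.random.default_rng(0)
n_voxels = 64 * 64 * 26          # 106496 voxels per dataset
scan_a = rng.random((n_voxels, 10))
scan_b = rng.random((n_voxels, 10))

# Score just the first 100 voxel pairs here; looping over all
# 106496 is what takes the Pi a couple of hours per dataset
scores = [mic_like(scan_a[v], scan_b[v]) for v in range(100)]
print(len(scores))
```

Swapping `mic_like` for the real MINE call is a one-line change, which is what makes it so pleasant to prototype the loop on the Pi before handing the full job to the cluster.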

It looks like something out of a movie, but it’s real!

What I really like about this is how real it all is. I’ve taken on many little projects here and there over the years, but none of them have ever really meant anything. Sure, I learned a lot, but I wanted to put it into action. Here, when those lines of text fly up my screen for 2 hours, I look forward to the output for reasons beyond just knowing it worked. It also gives me the opportunity to work on my projects at home using a Linux software environment, which is great since Python runs natively in Linux.

This also ticks another box for the Raspberry Pi Foundation. Universities and hospitals around the world have powerful workstations and supercomputers which they use to process neuroimaging data. That is very sensible, considering the sheer volumes they work with. But can it also be done on a £25 printed circuit board in Billy-whizz’s basement? Yes… it can!


Do Conspiracy Theories Constitute a Psychopathic Disorder?

They’re a topic that has received relatively little attention throughout the advancement of psychological research, yet conspiracy theories have a large impact on our involvement with society. We may only turn a tolerant scientific smile to those who tell us that 9/11 was plotted by the US government, but what about theories that our politicians are trying to engineer the decline of different systems to achieve their hidden motives? Such topics often grab our attention far more powerfully, and not always for bad reasons.

Early work by Hofstadter (1971) argued that conspiracy theories developed from people feeling incapable of bringing about change through social or political action. Because of this, some researchers even suggest that producing conspiracy theories is symptomatic of psychopathological behavior (Swami and Coles, 2010). Much of the current research does suggest that the underlying cause of conspiracy theory circulation is related to negative aspects of people’s lives.

The sharing of conspiracy theories has been found to change opinions, arouse anger and cause people to become more apathetic within the political system (Butler et al., 1995). It has been suggested that they are used as a means of justifying outraged and distressed emotions (Festinger, 1957), that they manipulate emotions to further their message (Sunstein and Vermeule, 2009), and that their use often causes people to take a biased view of evidence (McHoskey, 1995). All these sources, as cited by Swami and Coles (2010), present research implying that the promulgation of conspiracy theories is abnormal behavior.

But I don’t think that’s fair. As I said above, we may think it a bit odd that people are making claims like that Princess Diana was killed on purpose. But what about on a smaller scale? What if I say that university tuition fees have been raised to attack the poor? Or that the capacity of the A65 has been reduced, thus increasing traffic, so as to justify gaining funding to build a rail connection to the local airport? Am I spreading an emotion-fuelled message without solid evidence because I feel unable to make my voice heard through the proper systems? Yes, I am.

Look at how I changed the wording, though. Swami and Coles used the word “incapacity”, while I just used the word “unable”. That is because I don’t believe I am physically incapable of raising my opinion; I just feel that it isn’t listened to. And that is what Clark (2002) argues: that conspiracy theories demand greater transparency from government behavior. In some ways, they are a force to unite people in campaigning for much-needed changes (Sasson, 1995). This is exemplified by evidence that has come forward in recent years surrounding attempts by the US government to provoke war with Cuba in 1962 (Swami and Coles, 2010; see also Ruppe, 2001).

While certain conspiracy theories can be highly unethical, such as those that discriminate against different races or socio-economic classes, the manner of pathologising a behavior that we all exhibit in some form makes me skeptical. The research carried out on this topic is largely subjective, with no common construct upon which to determine whether this behavior is healthy or not. We may choose to frown upon certain theories while embracing others, yet the above evidence suggests this freedom of thought creates an open channel for democratic action, allowing us to put to use the powers of reason that separate us from all other species.


Ruppe, D. (2001). U.S. Military Wanted to Provoke War With Cuba. ABC News, Retrieved from http://abcnews.go.com/US/story?id=92662&page=1#.Tz6I8vFmLmM

Other references as cited by:

Swami, V. & Coles, R. (2010). The truth is out there. The Psychologist, 23(7), 560-563.



Can God be found with an fMRI scanner?

The Tower of Babel: Can mankind really use its finite knowledge and ability to discover God?

Is there a God or not? It’s a question that a lot of people wonder about, and also a question that evokes controversy. The early generations of the Old Testament (the origin of all three major monotheisms) built the Tower of Babel to try to get to heaven and find God. Now they’re trying it again, but with an fMRI scanner.

Joking aside, here I am examining research which intended to identify the neural correlates of spiritual activity in Carmelite nuns (Beauregard & Paquette, 2006). The proposal was that the memory of an activity in which the subject was “at one” with another person, and the memory of a time when the subject was “at one” with God, would identify the different areas of the brain involved in these respective activities.

A repeated measures experiment involved these nuns reliving different memories while having their brain activity recorded using functional magnetic resonance imaging (fMRI). The study rests on the core assumption of the biological perspective: that if activity takes place in the brain, some type of processing is taking place, which can be correlated with behaviors.

It is the variables of this study which I question. When studying matters over which we have more grasp, purely physical affairs, we can draw conclusions regarding neural correlates. However, when studying God, a being of whom science itself has little understanding, how can we use the mapping of neural correlates to study communication with an entity about whom we know little (and quite possibly nothing)?

If we are to identify the neural correlates of a religious, spiritual or mystical experience, one of the variables of the experiment must be whether or not there is a religious, spiritual or mystical power with whom to communicate. Without controlling this variable, there are too many assumptions.

The researchers state both that these activations within the brain may reflect the “impression that something greater than the subjects seemed to absorb them” yet also that the “external reality of God [cannot] be confirmed or disconfirmed” by identifying these neural correlates.

Therefore, the claims that these areas of the brain “mediate” spiritual experiences are not supported by the neural correlates of merely recalling a memory. Additionally, brain activity (the dependent variable) was measured during recall of a memory. Measuring brain activity when recalling a memory can only measure with certainty the brain activation for recall of that memory. It does not imply causation; that these are the neural correlates of a mystical experience.

Last of all, let us examine how these findings were portrayed by the media. The Telegraph published them with the headline “Nuns prove God is not figment of the mind” (Highfield, 2006). This statement portrays an opinion related to the research, but not the actual conclusion drawn. Neural correlates were established, but some may feel justified in drawing the opposite conclusion. Statements were used to support the writer’s opinion which were not put forward by the original research, and which carry the conclusions drawn by the original researchers to the reader in a misleading manner.


Beauregard, M., & Paquette, V. (2006) Neural correlates of a mystical experience in Carmelite nuns. Neuroscience Letters, 405, 186-190.

Highfield, R. (2006, August 30). Nuns prove God is not figment of the mind. The Telegraph.

Please note: As the writer, I would like to make it known that I do have a firm belief in God which is precious to me. I believe in an objective truth, and that in areas of perfect understanding, science and religion are synonymous.

Was it Freud?

Today I speak of the psychodynamic approach to psychology. The idea of subconscious drives and the like may not be regarded as empirical, but when I hear that said, I can’t help but feel it has something to it. Do you have full control over your thoughts? Has anything ever ‘just’ popped into your mind that you did not at all summon? There is definitely something beyond our consciousness that has a say in the content of our minds.

So, along came Freud back at the turn of the 20th century, and made some sensational suggestions, to which science gave a tolerant smile before moving on to behaviorism, the cognitive approach and others. But recently, a new approach has been constructed towards these psychodynamic principles.

The origins of this approach come from Bowlby and Ainsworth, who suggested that the bonding known as attachment theory comes not from subconscious drives (perhaps for food or sex) which become satisfied through contact with the mother, but from the needs for protection and security (Fonagy, as cited by Shaver & Mikulincer, 2005).

A recent study used modern methods to carry out psychodynamic research, using subliminal unconscious priming followed by tests in which participants had to discern whether a string of characters formed a word or not, with reaction time measured. This test found that proximity-related words were discerned with a faster reaction time (suggesting higher accessibility) (Mikulincer, Birnbaum, Woddis & Nachmias, 2000, as cited by Shaver & Mikulincer, 2005).

Another method employed to study accessibility used either a lexical task or a Stroop colour-naming task, during which a subliminal threat or neutral prime was presented. Names of people whom participants considered as providing security were recalled more quickly during this time (Mikulincer, 2002, as cited by Shaver & Mikulincer, 2005).

Both of these studies suggest the workings of subconscious thought, and present the evidence in a useful light. Admittedly, the results are somewhat vague. At most, they suggest the presence of subconscious activity, and point us in the direction of some of its applications. A report by Shaver and Mikulincer (2005) does say that much of the psychodynamic work is still done through careful and strategic introspection.

I have heard it said that if we could develop a computer that could perfectly simulate the brain, we would have no further need to study psychology. Indeed, empirical studies readily answer questions based on variables comprehensible to humankind. But the reason people have psychological issues is that they don’t quite understand the brain. Such empirical studies provide essential foundations for psychology; however, a lot can be done for applied psychology using psychodynamic paradigms.

How we got Split-Brain Studies

One more cringeworthy surgery to contemplate might be split-brain surgery, a process by which neurosurgeons cut the corpus callosum, stopping nerve fibres from carrying messages from one side of the brain to the other. In epileptic people, quickened activity between the two sides of the brain causes epileptic fits; separating them therefore reduces the rate of epileptic seizures (Carlson, 2010).

The first research in this field was done by Bykov in the early 1900s, when he worked with animals in Pavlov’s laboratory. By 1924 he had discovered that sectioning the corpus callosum in dogs prevented the contralateral skin-related conditioning of salivary reflexes (Glickstein & Sperry, 1960).

The Corpus Callosum. Source: Psych Web

The first human case was reported in 1940 by Van Wagenen and Herren, when the callosum was split in an attempt to cure epilepsy. The procedure was successful both in controlling epilepsy and in opening up new research on the corpus callosum (North Dakota State University).

Years later, the scientist Roger Sperry and colleagues performed further research into split-brain patients. They found that, although in day-to-day life their behaviour seemed practically normal, the severed communication meant that the two half-brains were actually operating independently of one another. Their work on animals showed that each half of the brain could be taught contradictory activities, with no perceivable mental conflict (Sperry, 1975).

The same behaviour is exhibited in humans. Patients who have undergone split-brain surgery have reported that their left hand seems to have a mind of its own. They may find themselves interestedly reading a book, yet then spontaneously, and through no conscious choice of their own, put it down (Carlson, 2010).

The different experiments performed, assessing responses to stimuli, opened up a whole new dimension of research, examining the brain in a situation not previously obtainable. The studies revealed how the two sides of the brain are specialised. Sperry was awarded the Nobel Prize in Physiology or Medicine in 1981 (Horowitz, 1981).

What impresses me here is how a potentially controversial operation endured through the ages to become not only acceptable, but also incredibly useful, both in controlling epilepsy and in understanding the mind. If the idea were suggested today, ethical alarm bells might ring, yet through a century-long research process, we now have scientifically grounded theories on the callosum and the two sides of the brain.

Is this a game that we have to play in order for research to gain widespread favour? Is that the way it should be? How much control do we have over what science achieves for humanity? It’s as if science itself is alive and kicking, an intelligence in its own right, with whom we work, that we might progress.


Carlson, N. R. (2010). Introduction. In N. R. Carlson, Physiology of Behaviour (pp. 2–27). Boston, MA: Allyn & Bacon.

The Brain: Intel Inside?

When I’m not busy studying psychology, playing karaoke or pursuing any of the other wonderful pastimes that we find as university students, I do like having a play on the computer. Now, please don’t judge me, I got bored of playing Grand Theft Auto long ago, and I am certainly not addicted to World of Warcraft. I enjoy real computing: programming, making websites, getting stuff to work and so forth.

Cognitive psychology is the study of how information is processed in our minds. It’s a study of knowledge, and of how our brain manages “attention, creativity, memory, perception, problem solving, thinking, and the use of language” (Neisser, 2009). This type of psychology has developed mostly since the 1950s, and has been quickened through the use of computers.

The idea of using computers in the study of cognitive psychology is to replicate mental processes in order to learn more about them. Cognitive theorists have suggested that the mind contains logical methods similar to those of a computer. It has also been proposed that neurons and their connections can serve as a model for data structures, and neuron firing and spreading activation as a basis for algorithms. While there is no single computational research method, studies combining computation, mind and brain work together to help us deduce new ideas (Thagard, 2011).
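As a toy illustration of the spreading-activation idea, consider the sketch below. The network, the decay value and the concept names are all invented for the example; the point is only the mechanism, where activating one concept raises the accessibility of its neighbours:

```python
# A toy semantic network: nodes are concepts, edges carry a
# fraction of a node's activation to its neighbours each step.
graph = {
    "dog":  ["cat", "bone", "bark"],
    "cat":  ["dog", "milk"],
    "bone": ["dog"],
    "bark": ["dog", "tree"],
    "milk": ["cat"],
    "tree": ["bark"],
}

def spread(activation, steps=2, decay=0.5):
    """Propagate activation outwards for a fixed number of steps."""
    for _ in range(steps):
        new = dict(activation)
        for node, level in activation.items():
            # Each node passes a decayed share to each neighbour
            share = level * decay / len(graph[node])
            for neighbour in graph[node]:
                new[neighbour] = new.get(neighbour, 0.0) + share
        activation = new
    return activation

# Prime the concept "dog" and watch related concepts light up
result = spread({"dog": 1.0})
print(sorted(result, key=result.get, reverse=True))
```

Directly connected concepts ("cat", "bone", "bark") end up more active than distant ones ("milk", "tree"), which is the same shape of result the priming experiments in cognitive psychology are after.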

Critics of the idea say that a computer uses only syntax (instructions) in order to do its job. A computer cannot change its mind; it requires user intervention. The human brain’s processes are so intrinsic that they cannot be fully defined by a programmer. At best, computational processes can only be offered as interpretations of mental processes. It is even argued that the brain is not an information processing device at all (Searle, date unknown). In addition, the rate at which technology advances brings ever-changing ways in which we program, meaning new methods of programming can come to light that disprove previously accepted computational theories (WikiEd, University of Illinois at Urbana-Champaign).

I observe that various cognitive processes can be represented better than others, the deeper processes being naturally harder to replicate. We can certainly use computers, to some extent, to hypothesize about cognitive processes and make predictions. What is interesting to me is that many of these studies began in the 1960s, when a whole city had ‘a computer’, and programs were punched into tape rolls. My computer now has a dual-core processor, but even eight-core processors are widely available. Think about that in the context of how many things the human brain can process at once. And that’s not to mention the advancements in all the other component parts of modern computers. How much more do computers represent the human mind now? And how much more could they?

Can Computers Learn?

From the 1983 film WarGames


Is Psychological Research Empirical?

Empiricism can be defined as “the doctrine that all knowledge is derived from sense experience.” [1] In other words, it is knowledge that comes through our observations. In a very simple sense, I drop my pen, and I can observe that it falls downwards. This is empirical evidence that supports the law of gravity. In the Publication Manual of the American Psychological Association, an empirical study is referred to as a report of “original research” (p. 10). Empirical research is important because it can be verified. Sensory observations can be measured, and tangible readings can be taken.

In psychology, whether our research is empirical or not is controversial. As we seek to form a scientific study, we look to gather empirical data. Often, our data comes from introspection, where a subject describes their feelings. Immanuel Kant (1724-1804) argued that psychological research could not be considered empirical, because “mental events cannot be quantified” (Fuchs & Milar, 2002). He suggested that these mental events cannot be analysed either in the laboratory or using mathematical analysis. As these thoughts and feelings are verbally conveyed and then interpreted by another, meaning can change or become lost, resulting in a game of psychological Chinese whispers.

Kant suggested instead that we should use physical observations, things which can be measured. Indeed not all data is gathered by introspection, and in recent years, technological advances have allowed more and more alternative methods for gathering empirical data.

When studying the brain and the nervous system, extensive methods and tools are now available to monitor activity within these areas. The process of “single cell recording” shows us how different specialised cells are in place to detect different types of image. This has given us valuable insights into how vision works (Gleitman, Gross & Reisberg, 2011, p. 105). These processes do not use introspection, and deliver more solid results that we can work with.

While these new advances in technology often allow new and more accurate methods of empirical study, I do believe most of our research still involves a form of introspection.

A study of facial emotional expressions revealed that some basic emotional expressions are found across different cultures (Ekman & Friesen, 1975). In this study, participants were shown faces and asked to categorise each face according to what type of emotion it was displaying. Participants were given six different emotions to choose from. Based upon this research, a further study was carried out more recently, which used the same method of asking participants to categorise facial emotions; this time, however, eye movement was tracked using modern equipment (Corden, Chilvers & Skuse, 2008). This study found that people’s eyes avoided looking at “emotionally arousing” stimuli, such as “fearful and sad expressions”.

During these experiments, introspection was used in conjunction with modern technology in order to assess participants’ perceptions, before further studies and conclusions could be made.

So, are psychological studies empirical? The introspective data is not entirely tangible; it is opinion-based. The same stimulus could be described or categorised differently by different people. At the same time, introspection is still a very useful way to gather data. In my view, the question of validity plays a role. Does it measure what it claims to measure? I believe that generally it does. I admit that this gives the results a weaker foundation, but in most cases they are sufficiently valid to draw valuable conclusions from.


[1] Dictionary.com

Fuchs, A. H., & Milar, K. S. (2002). Psychology as a Science.

Gleitman, H., Gross, J., & Reisberg, D. (2011). Psychology (8th ed.).

Ekman, P., & Friesen, W. V. (1975). Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues.

Corden, B., Chilvers, R., & Skuse, D. (2008). Avoidance of emotionally arousing stimuli predicts social–perceptual impairment in Asperger’s syndrome.