Raspberry Pi Data Logging Sensors

Having pursued a career in data and analytics, I’ve been thinking recently that it would be fun to take on some kind of data logging project at home. In light of that, I’ve been looking at a few sensors available for the Raspberry Pi that I could use.

Temperature

Probably one of the most straightforward measurements would be temperature. With just a temperature sensor and the right resistor, the Pi can measure temperature and capture it through the terminal window. From there, it is very straightforward to log the readings to an ASCII file, or even drop them into a SQL database. Using a Python module such as matplotlib, the system could then generate a constantly updating graph and publish it to a web server, making the results from a very simple sensor accessible via a browser.
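To make that concrete, here is a minimal sketch of what the logging side could look like. It assumes a DS18B20-style 1-wire sensor, whose Linux kernel driver exposes each reading as a small text dump under /sys/bus/w1/devices/; the function names and file paths here are illustrative, not from any particular guide:

```python
import csv
import time

def parse_w1_slave(raw):
    """Pull the temperature (in Celsius) out of a DS18B20 reading.

    The kernel's 1-wire driver exposes each reading as a two-line text
    dump; the first line ends 'YES' when the CRC check passed, and the
    second ends in 't=<millidegrees>'.
    """
    lines = raw.strip().splitlines()
    if not lines[0].endswith("YES"):   # bad CRC: discard the reading
        return None
    return int(lines[1].split("t=")[1]) / 1000.0

def log_reading(path, temp_c):
    """Append a timestamped reading to a CSV file for later graphing."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([time.strftime("%Y-%m-%d %H:%M:%S"), temp_c])

# A sample dump of the sort found at /sys/bus/w1/devices/28-*/w1_slave:
sample = ("72 01 4b 46 7f ff 0e 10 57 : crc=57 YES\n"
          "72 01 4b 46 7f ff 0e 10 57 t=23125")
print(parse_w1_slave(sample))  # 23.125
```

Reading the sysfs file in a loop and calling log_reading every minute or so would build up exactly the kind of CSV that matplotlib can re-plot on a schedule.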

The Sense Hat

A more refined piece of hardware already exists: the Sense HAT. If I understand it rightly, the Sense HAT sits on the Raspberry Pi board directly over the GPIO pins. As well as a temperature sensor, it senses air pressure and humidity, and it even has a built-in accelerometer. It looks like a very capable board for a fantastic price. It was also used recently for the Astro Pi project by astronaut Tim Peake. It looks like a very straightforward, easy-to-set-up means for learning data logging.

Weather and Air Quality

I also learned that weather is an interesting option for Raspberry Pi data logging. Recently, Pi-based weather monitoring kits were distributed to a set of schools. Unfortunately these never made it into general production; however, there are several guides available for sourcing the components and making your own.

Looking at the weather stations got me looking at something I'm more interested in: air quality. In recent years I have thought more and more about it. In 2013 I spent Christmas in Salt Lake City, Utah. While I was there, I saw something I'd never seen before: visible smog. I would say Utah is no more or less car-loving than the rest of the USA, but a few geographical features of the landscape make it prone to this inversion. SLC sits in the Salt Lake basin, surrounded by mountains and with little or no weather, so the pollution just gathers in the bottom of the valley until a weather front blows it away. Anyway, I digress. I never realised how real it could be.

Now, I live in Leeds, which seldom, if ever, has smog. Even so, British cities have been criticised for their poor air quality recently, so it would be interesting to measure air quality around here and get some of my own statistics on what's going on.

There look to be a few sensors available. The most interesting is the Sensly HAT, which senses particulate matter, carbon monoxide, nitrogen oxides and other poisonous gases in the air. It's still in development, with a hopeful release date of September 2016, and is being developed by Altitude Technology with the support of a number of organisations. I like the look of this product; it carries the backing of the University of the West of England, so it should be well refined and produce some good data. I'll be interested to keep an eye on its development.

There are a few more obscure sensors already available for the different measures, such as the SEEED STUDIO 101020078 Grove Air Quality sensor. These look a little trickier to work with, and although I gather they could work, an electronics novice like me might struggle.

So there we have it: a set of Raspberry Pi sensors measuring a variety of interesting constructs. I particularly like the Sensly HAT and I hope it gets into production soon. It would be most excellent to have a crowd-sourced air quality measuring scheme to support air quality research, and it would make an awesome DIY data logging/analytics project too.

Finishing University

It is June 2016, a little over five years since I returned home from being a missionary for The Church of Jesus Christ of Latter-day Saints. In September 2011 I started my undergraduate psychology degree at Bangor University, and in August 2015 I finished my master's degree in Cognitive Neuroscience at The University of York.

When I started at Bangor I had visions of becoming a clinical psychologist and writing regular blog articles about the fascinating things I learned. As it happens, neither of those came to pass. University life turned out to be quite hectic as I tried to juggle my studies in remote Bangor, my church life and my part-time job in the local supermarket. I didn't manage to get into the culture of volunteering in the right places to enhance potential applications to the highly competitive clinical psychology courses. I also felt wary of the routes into academic and psychotherapy careers. Plenty of people told me I had the potential, but while the things I learned interested me, I just wasn't overly motivated to pursue an academic career in them.

Fortunately, I did find one thing very fascinating: programming. I did my master's degree in Cognitive Neuroscience, an area where a lot of new and novel analyses are performed. In many cases, the software doesn't exist to analyse the results; it is up to the researcher to write the programs to perform the calculations. My final project involved a lot of coding and a novel approach to analysing data. It awoke a dormant part of me that I hadn't visited for a long time.

My master's project showed me what I was good at, and what I enjoyed. Before I went to Germany to be a missionary for my church, I did lots of programming and different IT projects, but since returning and deciding to pursue a psychology career I'd put it all on the back burner. It came back to me pretty quickly, though, and it motivates me to the extent that I don't always want to put it down.

I decided that a programming and analytics career would be for me, and I have since been fortunate in finding employment with Callcredit Information Group. I won't go into too much detail about my job, because I think it's important to respect my company's privacy, but it has been a terrific opportunity to develop my programming knowledge and apply my analytics skills to some interesting problems.

The other thing I mentioned at the top was my dream of blogging about my learning at university. It's a wonderful ideal that I didn't manage anywhere near as often as I had hoped. Apart from where it was a mandatory requirement for passing a course, time tended to get the better of me. I also didn't want to give away too many details of the research projects we were doing, in case I jeopardised potential publication opportunities in the future.

So here I stand, having made a lot of progress yet ending up in an entirely different place from the one I intended. That is fine, because I've learned to play to my strengths and discover what motivates me. If you read back through my blogging history, you'll see that I write a lot about academic motivation. Knowing how to manage knowledge and experience is key to keeping up with an unpredictable economic climate. I have a job I enjoy, where every day is different, where I have problems to solve and where I make a meaningful contribution to company processes. It sounds like the perfect cliché!

As for the blogging, it's still a goal to write a meaningful and interesting blog about something. I'm just not entirely sure what. For now, I'll take an open-ended direction.

What is Higher Order Thinking?

Highlighted articles. Text courtesy of Sargolini et al., 2006.

I was working on my masters thesis today, and I started to think fundamentally about what my brain is doing as I work on an academic project like this. Like many other students undertaking academic writing for the first time, I have found it quite hard at various points. Sitting with a journal article in front of me, or having a blank document on the computer screen can be daunting.

I have found the best solution is to break everything down into small pieces. If I don't know what I'm looking for in a journal article, I begin by just highlighting different bits of key information, such as areas of the brain or the rationale for the study. Instead of viewing the project as a linear entity that I just need to get on and do, I look only as far as the first step that will open more doors for me.

When I view it this way, I begin to realize that academic work is really just making lots of small decisions. Each decision requires me to take what I know, and decide how it relates to something else. This process is a microcosm of thousands of such decisions that will be made between the start and end of a project. For me, thinking about it this way builds my confidence and makes my work more achievable.

When I tell people I'm doing a master's degree in cognitive neuroscience, I often get replies such as 'I'd rather you than me…' or 'I'm too stupid to do anything like that…' My message here is that this need not be the case. Achieving at university need not be something just for the 'smart'. All that I do at university is practice decision making in such a way that I produce scientific research. I suppose somewhere along the way I also memorize some stuff, but that is really just a natural by-product of learning this new way of thinking.

The point I want to make is that anybody can do it! I'm not saying it will happen at the click of a finger, but I am saying almost anyone can do this with the right mentoring and practice. This, however, is what separates higher education from regular learning and work. There are lots of very straightforward jobs out there where we know exactly the what, when and how. That's fine; there are plenty of jobs that need doing, which are highly valuable (or ought to be), but that don't take much higher order thought. But there are plenty of complex problems to solve too, which can be addressed by a higher-level process of thought. I hope that when we view higher education in this way, it can become more graspable: rather than a scary and mysterious realm of perpetually hard work, a place for the mind to be exercised.

Raspberry Pi meets Cognitive Neuroscience

Can the Raspberry Pi be used to process 4D neuroimaging data?

When I started my masters degree, I didn’t entirely know what I was taking on. I chose to study cognitive neuroscience because I knew this was an area which presently receives a lot of funding for PhD research, and also because it seemed like a robust, scientific approach to psychology. After a few short months I have come to discover that research in this area is something that I really enjoy, and that is largely down to the opportunities to program and develop computerized research tools.

Around the same time, the Raspberry Pi Foundation announced and launched the new Raspberry Pi 2: a credit-card-sized motherboard which can be run as a standalone personal computer. I'm not going to go into all the ins and outs of the Raspberry Pi (or Pi, for short); you can find out more by visiting their website at www.raspberrypi.org. Instead, I want to talk about what it brings to the study of psychology.

Anybody who has studied psychology as a science will know that a lot of research is made up of undergraduate students sitting in dark rooms performing mundane tasks while their reaction times are measured. This has been terrific for the creators of Matlab, whose IDE has facilitated many a psychology-button-pressing extravaganza. Even so, there is a new kid on the block gradually gathering momentum, and that is Python.

One of the modules we undertook this spring semester was programming in Python, something I hadn't done before. I was keen to do at least some of this on the Pi, so as to justify my impulsive buying of it. My lecturer found it kind of cute (that would be "cute", except that he didn't say it aloud). I persisted nonetheless, until I found that, at least for graphics, the Pi wasn't altogether compatible with the PsychoPy module we were using. Unfortunately it does not support OpenGL graphics, so it actually struggles with button-pressing experiments. But it was a start, it whetted my appetite, and it was fun.

The real fun has commenced as I've begun my dissertation project. We're doing some work on spatial cognition (how space is represented in the brain) using functional magnetic resonance imaging (fMRI), and part of this involves calculating different test statistics on 3D data on a voxel-by-voxel basis (voxel = volumetric pixel). There are lots of Python modules optimised for flattening, analysing and reassembling these datasets, and we're using them to build novel analyses which haven't been done before. On this frontier of neuroimaging research, the Raspberry Pi stands gallant as my building platform and testing station for producing these scripts.

I've been using the module 'minepy' to calculate the Maximal Information Coefficient (MIC) for each voxel in a set of datasets. The modules install seamlessly on the Pi from the different repositories, and scripts can be written elegantly through the 'spyder' IDE. Most of all, I'm excited about the sheer size of the data we are working with. Each dataset is 64x64x26 voxels, which means 106,496 calculations. On the Pi, each one takes a fraction of a second, and (when I close the GUI) all in all it takes about 2 hours. Running this on all 48 of our scans would take about 4 days. Fortunately, for processing the whole lot we can pass our script to the cluster computer at York, which (assuming it is coded correctly) should polish it all up in around 16 hours.
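The flatten–compute–reassemble pattern could be sketched like this. To keep the example self-contained I've used a simple absolute Pearson correlation (via numpy) as a stand-in statistic, with toy array sizes; in the real script the per-voxel call would be minepy's MINE (compute_score followed by mic()), as noted in the comment:

```python
import numpy as np

def voxelwise_stat(data_4d, regressor, stat):
    """Flatten a 4D (x, y, z, t) image to (n_voxels, t) rows, score
    each voxel's time course against a regressor, and reshape the
    scores back into the original 3D grid."""
    x, y, z, t = data_4d.shape
    flat = data_4d.reshape(-1, t)                 # one row per voxel
    scores = np.array([stat(row, regressor) for row in flat])
    return scores.reshape(x, y, z)

def abs_pearson(ts, reg):
    """Stand-in per-voxel statistic. With minepy installed this would
    instead be: m = MINE(); m.compute_score(ts, reg); return m.mic()."""
    return abs(np.corrcoef(ts, reg)[0, 1])

rng = np.random.default_rng(0)
data = rng.standard_normal((4, 4, 3, 20))   # toy stand-in for 64x64x26
regressor = rng.standard_normal(20)
result = voxelwise_stat(data, regressor, abs_pearson)
print(result.shape)  # (4, 4, 3)
```

Because the statistic is passed in as a function, the same skeleton runs unchanged on the Pi for testing and on the cluster for the full 48 scans.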

It looks like something out of a movie, but it's real!

What I really like about this is how real it all is. I've taken on many little projects here and there over the years, but none of them have ever really meant anything. Sure, I learned a lot, but I wanted to put it into action. Here, when those lines of text fly up my screen for 2 hours, I look forward to the output for reasons beyond just knowing it worked. It also gives me the opportunity to work on my projects at home in a Linux software environment, which is great since Python runs natively on Linux.

This also ticks another box for the Raspberry Pi Foundation. Universities and hospitals around the world have powerful workstations and supercomputers which they use to process neuroimaging data. That is very sensible, considering the sheer volumes they work with. But can it also be done on a £25 printed circuit board in Billy-whizz's basement? Yes… it can!

How Metacognition Might Have Saved Tesco’s Bacon

This week, the UK supermarket Tesco has landed itself in trouble over providing incorrect profit figures to the City (the stock markets). Now questions are being asked about whether Tesco bosses were being deliberately misleading, or simply incompetent. From a psychological perspective, I believe the answer lies in a factor named 'metacognition'.

Some of you who follow me on Facebook or other social media channels will know that I have been making a bit of noise about this topic recently. I've been publishing my latest metacognitive awareness score, calculated through a smartphone app that I have been working on over the last few months. With this blog post, I intend to answer the questions of what it is about and why you should want to download it.

So, what is metacognition? Metacognition means knowing about what you know. We define it as an awareness of one's own knowledge and mental processes. A higher metacognitive awareness will help you think better: it will help you to be more effective in how you handle your knowledge and your learning.

The app we have produced is based upon several years' worth of research conducted at Bangor University. Through empirical studies, it was shown that metacognitive awareness can be increased through confidence-based testing. Confidence-based testing attaches a secondary component to a question, wherein the participant not only gives an answer, but also an indication of their surety in that answer. As a result of these studies, we have developed the Cognaware app to replicate the study methodology for individual smartphone users.

How sure are you?

This general knowledge quiz rewards confidence in correct answers, as well as the user's ability to correctly identify when they are guessing. On the flip side, it punishes users who are confident in an incorrect answer, while giving only a token point for a correct answer the user thought was a guess. The responses are then analysed using signal detection theory to give you your metacognitive index: in other words, your ability to discriminate between what you know and what you don't.
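For the curious, here is a rough sketch of the kind of signal-detection calculation involved. To be clear, this is a generic illustration, not Cognaware's actual scoring: confident-and-correct responses count as hits, confident-and-incorrect as false alarms, and the gap between the two rates (in z-units) indexes how well confidence tracks knowledge.

```python
from statistics import NormalDist

def meta_index(trials):
    """Toy type-2 sensitivity score: how well confidence tracks
    correctness. `trials` is a list of (correct, confident) booleans.
    'Hits' are confident answers that were correct; 'false alarms'
    are confident answers that were wrong."""
    z = NormalDist().inv_cdf
    when_right = [conf for ok, conf in trials if ok]
    when_wrong = [conf for ok, conf in trials if not ok]
    # Log-linear correction keeps the rates away from 0 and 1.
    hit_rate = (sum(when_right) + 0.5) / (len(when_right) + 1)
    fa_rate = (sum(when_wrong) + 0.5) / (len(when_wrong) + 1)
    return z(hit_rate) - z(fa_rate)

# Confidence mostly follows correctness:
good = [(True, True)] * 8 + [(True, False)] * 2 + \
       [(False, True)] * 1 + [(False, False)] * 4
# Confidence unrelated to correctness:
flat = [(True, True)] * 5 + [(True, False)] * 5 + \
       [(False, True)] * 3 + [(False, False)] * 3
print(meta_index(good) > meta_index(flat))  # True
```

A participant whose confidence is unrelated to whether they are right scores around zero; the better confidence discriminates knowing from guessing, the higher the index.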

At this point, you may understand everything I just said, or you may find it all awfully complicated. Instead of worrying about the ins and outs of it, let me tell you why metacognitive awareness is important.

Have you ever listened to somebody harp on about something when they don't actually know what they're talking about? Politicians and other people trying to save face do it all the time. Either these people have no integrity, or they have a poor metacognitive awareness; in reality it is probably an interaction of the two.

Consider the way we handle knowledge, whether in business or higher education. Working with accurate facts and knowledge is the difference between cutting-edge success and tragic failure. When it comes to hard reality, sometimes saying 'I think' just won't do. In the aftermath of the Iraq war, the evidence (i.e. the absence of weapons of mass destruction) revealed that the grounds for the invasion were based on a hunch. When I listen to political debates, I am increasingly aware of the absence of sources used to back up politicians' arguments. We have seen some tragic and foolish decisions made on the basis of heuristic knowledge.

Bringing it back to Tesco, I believe that if more people within the senior ranks of the organisation had a higher metacognitive awareness, somebody would have blown the whistle before it came to this. Whether through an increased conviction in recognising discrepancies, or through being more willing to face the reality that the City would notice a £250 million deficit, that ability to discriminate between hopeful 'I think's and reality might have saved four senior executives their jobs.

Now this isn’t a miracle pill. It’s not a get rich quick scheme. It’s an evidence based app which will train you in a wise thought pattern. But now you are aware of metacognition, and the role it plays, and so I ask you: is this a skill you want to work on? If it is, a daily quiz on Cognaware is a fun way to do that. Remember, it’s not what you know, it’s what you know about what you know.

Cognaware is available on the Apple and Google Play stores. For more information, including learning about the peer reviewed literature that supports metacognition, please see www.cognaware.com.

Re-inventing the Computer

I've spent a long time now looking at different ways that psychology might be applied to improve education. I've learned about networked learning: how knowledge is stored more intelligently across groups. I've researched motivation: the ways in which student empowerment and student-directed learning create a more whole educational experience in the individual. I've read about the flow state of mind, being captured by the moment in a most pure state of intrinsic motivation. I've found out about the dangers of carelessly deployed technology, which only teaches students to accrue points, or merely provides a virtual alternative to what worked perfectly well on paper.

Indeed, I would say that not many of the present classroom applications of technology are really hitting the nail on the head. There is, however, one computer game which seems to apply these ideas exceedingly well: Minecraft.

This retro-style indie game has risen to be the best-selling PC game of all time. It is an open-ended, sandbox-style game in which players build things by placing and removing blocks. The game has developed over several years, giving users items such as switches, power sources and hoppers, allowing some quite smart mechanisms to be created. It began with the automation of 'crafting' (putting several raw materials together to create a new item), but as people's ingenuity has developed, so have their creations. Right now, they've advanced as far as building 16-bit computers. In a very real way, the entirety of the Enlightenment and the Industrial Revolution have now repeated themselves – in a cult computer game!

The Analytical Engine, the mechanical computer proposed by Charles Babbage.

Now, think about that for a second. Think about what a computer is, and how computers have developed. What is now done in a microprocessor was originally done by a contraption which filled a whole room. And before even that, the first computer ever to be invented was entirely mechanical. Now, we have an army of young people replicating these archaic structures using Minecraft.

This community is incredibly well networked. Take a look at the number of examples and tutorials that come up when you search 'minecraft redstone computer' on YouTube. When one learns, the wealth is shared. And when the one shares their understanding with the many, that individual's own understanding is strengthened. Ideas bounce serendipitously, and the evolution of these systems has been rapid. The bottom line is this: young adults, teenagers and even children within this community can master the very fundamentals of modern-day computing. I don't even understand that, and I have been a computer enthusiast since I had my baby teeth.

This is the type of educating that is fun, motivating and, above all, highly effective. Each and every learner within the Minecraft community is participating in a race to the top. These skills and this understanding are the very things that will drive knowledge economies to excellence. To this I would ask: what parallels can be drawn to make an educational model as resonant as the Minecraft community?

It would be very different. It would challenge all convention. It would likely be chaotic. However these things have all been recognised as attributes contributing to a successful knowledge economy.

It would take a brave teacher to set up a Minecraft lab inside their classroom, but hey! Here's to the crazy ones, right?

Meditation and Mediation

I read a book recently that has really fascinated me. It's entitled 'Mediated: How the Media Shapes Your World', and it's written by Thomas de Zengotita. It is a powerful demonstration of just how much the media influences our lives. One of the key points de Zengotita makes concerns authenticity. In a climate of ubiquitous mediation, nearly our entire identity is formulated out of snippets of representations which we have gathered through film, politics, reality television and more.

The premise of it all is this: everything around us, whether it be road signs, magazines, film, social media or advertising is all addressed at YOU. While you may be just one of millions, nearly everything you see is addressed at you directly – and me – and all of us. This is ultimately very flattering. The flattery becomes manifest when individuals are trying to decide which identity (or which collage of mediated identities) they intend to present to the world. In times past, such behaviour may have been reserved for royalty, or significant public figures. However now, everybody is trying to write out the story that is their life.

This is all well and good (well, maybe), until it hits crisis point: until there are so many representations, so many identities, so many statements to be made, that an individual simply cannot handle it all. The tyrannous belief that one MUST be successful in this endeavour can throw many individuals into a neurotic state. It becomes the source of a lot of anxiety and depression.

Mindfulness Meditation

It should therefore be no surprise that mindfulness meditation seems to be the opposite of this carry-on in every way.

Mindfulness is a relatively new type of meditation, in which one seeks to become a passive observer of the situation around oneself. Thoughts, emotions and sensations are monitored by the individual so as to reach a state of detached awareness. A recent special edition of the New Scientist reviewed mindfulness and other meditation research, and talked about the way meditation trains people for improved emotional regulation.

Whereas all things mediated are addressed at you, meditation detaches you from that setting, and helps you inhibit the urge to pay homage to each stimulus you encounter. Considering that one of the approaches to cognitive therapy is to help clients stop processing the negative automatic thoughts, this helps piece together the picture of how mindfulness works.

It would be fascinating, with all the knowledge we have today, to go back to a pre-technological era and measure the relationship between psychological wellbeing and a far less ubiquitous media. With de Zengotita's ideas on media theory and the principles of mindfulness-based wellbeing converging, it paints an ever clearer picture of the cognitive aspects of wellbeing.

The Connectivist Riots

I've spent quite a lot of time recently thinking about things that aren't real. I've realised that, actually, not a lot of things are real. Obviously there ARE a lot of things that are real: the computer in front of me and my car outside are both very real. However, there is a vast array of things processed by the human mind that have no physical form.

From a consumer perspective, consider the 'Coca-Cola' brand. It has been said that if the entire infrastructure of Coca-Cola's operations were lost, the company would be okay as long as the brand were preserved. Where does the Coca-Cola brand exist? It is a representation in your mind.

The same can be said about psychotherapy. Cognitive-behavioural therapists will help clients recognise that many of their beliefs are fantasies, and that stopping a certain thought can solve the problem.

Many of the thoughts we have concern the qualitative (attributes), quantitative (numbers) and connective factors of the things we interact with. For example, many things are red. You cannot create redness though, can you? You could produce red paint, or show me something red, however red only exists in your mind.

As you can see, the role of these thoughts is as a means to an end. The end is to buy Coca-cola, live more comfortably, or enjoy the knowledge that you own a red car. These ideas and norms only work because of social connections which tell us our ideas are in sync with the wider world, and thus, something more real is taking place.

This is the origin of connectivist theory: a new way to view learning, as a process which takes place across a network of people. It has come to light at this point in time because we now live in a highly connected age, where we also have a lot of mediated-but-non-existent stuff to sift through. Connections are powerful, as they give us a new perspective on things like accountability and authority. Suddenly, large groups of people are able to communicate, unify, travel or effect change.

The 2011 riots which took place across England.

The rise of mediation and connection has improved access to education, living standards, democracy and freedom of speech. It has also made possible less desirable things, such as the London riots and mass terror. I wish to dwell on this point for a moment, because I think it really does put things in perspective. The London riots began with a small group of youths discontented with a court decision relating to a member of their community who had been shot by the police. As violence erupted, it was exacerbated as it was shared across Twitter. There was no specific factor that unified all who rioted, other than their access to social media.

I've often thought that if the connected world has the power to do what it did in England in 2011, it should have the power to do enormous levels of good too. It is connectivism that has given much positive publicity to Edward Snowden's whistle-blowing. (I mean, you don't think it came from the establishment, do you?) The massive quantity of ALS ice bucket videos is another example of the power of networked activity.

There's a lot of power here, which can be employed in problem solving or divergent creativity. The power of connectivism is decentralised and virtually impossible to take away, and I don't think that is a bad thing, because it empowers individuals. In fact, the only thing that could stop it would be for the energy to run out.

The Division of Knowledge

I was re-reading the paper by George Siemens entitled: Connectivism: A Learning Theory For The Digital Age. While I was reading, something hit me.

Siemens talks about the way the abundance of information in today's world is forcing us to do things differently. He explains how the continuity that existed for people in days gone by is all but gone. People in previous generations may have trained for one career and remained engaged with it their whole life. Now, however, we have an ever-growing pot of knowledge which would be almost impossible for any one person to handle or evaluate.

A solution, which Siemens calls connectivism, stores knowledge across social networks instead. In connectivism, groups of people benefit from laterally stored knowledge, which is then evaluated almost phenomenologically by the collective intelligence of the group.

Now all this is very smart, but where does it fit into the real world?

Well, once upon a time, a couple of hundred years ago, there was a different revolution: the Industrial Revolution. We discovered that we were sitting on an abundance of fossil fuels. Through organisation and the division of labour, the production capacity of what had until very recently been a manual process grew exponentially.

In the UK, productivity benefits from the division of labour have been commemorated on the £20 note.

This is the information revolution. Our fossil fuel is computing power. As we use one another as surrogates for knowledge (Stephenson, undated), our modern replacement for the division of labour is the division of knowledge. Just as our industrial ancestors created highly productive patterns out of the chaos, businesses and researchers alike can benefit holistically from teamwork and effective divisions.

When I hear talk of the knowledge economy, I am aware that this is not merely a transition to office jobs. Those who will thrive will be those who can sift through the meaningless noise and identify the applications that will yield a real benefit.

Coping with Autism: My Perspective, Part 1

A powerful and open perspective from a friend of mine.

AUTHISTIC

I’ve been in one of those moods recently. The sort of mood which begs for some sort of output, some sort of creative release. The last time this happened, it was brought on by watching what turned out to be an abomination of a show. You could probably say that what brought this on is of more consequence.

This past week I was in Nottingham, volunteering as a counsellor in a youth programme called FSY (For the Strength of Youth). I was responsible for nine incredible young men, and it was one of the most rewarding and exhausting experiences of my life. I loved it. And yet, when it concluded on Saturday, I ended up in one of the lowest moods I’ve felt for a long time. Why? I’ll get into more specifics later on, but for now it suffices to say that I tweeted this at one of my…
