Chapter 22
In This Chapter
Structuring research reports appropriately
Recognising what to criticise and what to leave alone
As a cognitive psychology student, you’ll almost certainly be expected to write a research report. These reports are designed to present your own experimental findings: that is, you need to design, carry out and write up your own experiment. After all, cognitive psychology is a science.
You have to follow certain rules when writing these reports, as set out in the Publication Manual of the American Psychological Association (APA). The APA developed these rules over many years to ensure that all scientists produce work that’s similar and that everyone can easily understand. Some of these rules may seem pedantic or even downright silly, but you must follow them.
Even if you don’t have to write a research report, knowing how they’re constructed is definitely useful. Given the ever-increasing number of online free-to-access scientific journals, more people are being exposed to cognitive psychology research than previously. Journalists read scientific reports and then write stories about them. Often, however, they make mistakes or interpret these reports incorrectly. Thus, being able to read scientific reports accurately is vitally important.
In this chapter, we present eight dos and two don’ts of writing cognitive psychology research reports. We give you the tools to write, and read, scientific reports appropriately. The tips are based on what professional researchers (like us) do. Therefore, check with your tutors that the advice is the same for your specific course.
We know that you expect good value from For Dummies books and so here’s a bonus tip. Take a look at these free online journals to get a feel for what published cognitive psychology papers look like: Advances in Cognitive Psychology (www.ac-psych.org), Frontiers in Cognition (http://www.frontiersin.org/cognition) and Journal of Vision (www.journalofvision.org).
The APA’s rules for writing reports tell you what font to use, how to set the margins, what subsections to include and how to write scientifically.
Your chosen font needs to be easy to read in terms of design and size. Fonts such as Arial and Times New Roman are known to be easier to read than, say, Comic Sans or Brush Script. The size has to be big enough to read comfortably, but not so large that it wastes paper. So use size 12 Times New Roman. Also, always double-line (or 1.5-line) space your text. Reading is easier with extra line spacing, and it gives your tutor room to write comments on your work!
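If you happen to typeset your report in LaTeX rather than in a word processor (an assumption on our part; most students use a word processor), the same formatting rules might be set up with a preamble like this (the margin value is just an illustration; check your course’s requirements):

```latex
\documentclass[12pt]{article}        % 12-point base font size
\usepackage{mathptmx}                % Times-like font, easy to read
\usepackage{setspace}
\onehalfspacing                      % or \doublespacing for double spacing
\usepackage[margin=2.5cm]{geometry}  % generous margins for tutor comments
\begin{document}
Your report text goes here.
\end{document}
```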
Research reports need to contain these seven sections in this order: Title, Abstract, Introduction, Method, Results, Discussion and References.
Tables and figures are great for showing off results, but do label them correctly and clearly and don’t put them in colour. For figures, label all axes and place a caption underneath (readers have to be able to understand a figure without reading the text). Don’t use vertical lines in tables, because they clutter the space and make tables hard to read.
In the report’s introduction, justify everything to do with your study, including why you’re doing it: that is, explain the purpose from a scientific point of view. Simply being interested in a topic isn’t sufficient (though it helps!); your research needs practical application.
The opening of an introduction usually sets the scene for the research. Explain why the research is useful for society and any practical reasons for doing the work. Memory research is often useful for understanding and improving eye-witness testimony, for example.
The rest of the introduction describes the kinds of work that have been published previously. These descriptions need to include only relevant and important studies – not everything you’ve read. In fact, include only the absolute minimum to get your point across.
Part of the fun of writing research reports and generally being a cognitive psychologist is getting to criticise existing work. No published papers are perfect (except ours, obviously!) so always look for ways of criticising them. Criticisms form the basis for pushing science forward and establishing new, better ways of testing things.
You can also criticise the appropriateness of the tools used to measure a certain effect. Does a particular experimental variable really measure what it’s supposed to? In other words, try to explain someone else’s results in another way. Develop your own explanation of their results.
Also up for criticism is the statistical approach used in a particular study – though this probably requires you to understand statistics (perhaps Psychology Statistics For Dummies by Donncha Hanna and Martin Dempster [Wiley] can help!).
Ensure that your experimental hypothesis fulfils these four requirements:
Method sections are usually divided into four subsections: Participants, Materials, Design and Procedure.
Materials: Describe everything you used in plenty of detail. If you did a memory experiment with words, describe what type of words you used. Evidence shows that shorter words are remembered better than longer ones, as are distinctive words, high-frequency words and so on. You can also justify why you chose these particular materials.
You can’t put too much information into this section about the materials used (though don’t include pedantic detail such as the type of pencil used to fill in a questionnaire!).
The whole point of reports is to collect data and show them off. Thus, the results section of a practical report is one of the most important – clarity and understanding are vital. Make sure that anyone would understand your results and that they relate to your hypothesis.
Always describe what your numbers are and what they mean: basically, state the range the scores can fall between. Also describe the trends and patterns in the data (that is, which conditions produce higher means than others). Then comes the pretty bit, the table or figure (it’s a graph, but the APA guides always refer to ‘figures’, and so don’t use the word ‘graph’ in your research reports).
The table or figure presents the means of your results, which summarise all the participants’ data nicely and succinctly. Throughout this book, we refer to experimental effects where something was ‘bigger than’ something else; these effects come from the results sections of research reports. The beauty of figures is that they show differences between numbers clearly and visually – but they aren’t as precise as the numbers in tables.
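As a small illustration of where those condition means come from, here’s how you might compute them from raw scores. The experiment, the condition names and the data below are entirely made up for this sketch:

```python
from statistics import mean, stdev

# Hypothetical recall scores (words remembered out of 20)
# for two conditions of an imaginary memory experiment.
scores = {
    "short words": [14, 16, 12, 15, 17, 13],
    "long words": [10, 11, 9, 12, 10, 8],
}

# A results table usually reports the mean (M) and standard
# deviation (SD) for each condition.
for condition, data in scores.items():
    print(f"{condition}: M = {mean(data):.2f}, SD = {stdev(data):.2f}")
```

Notice how the two means summarise twelve individual scores in just two numbers – exactly what a results table does for readers.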
You need to interpret your results within a theoretical framework. Refer again to the theories and background research you present in your introduction: are these theories supported? If they’re not, why not?
Ensure that your interpretation of the results considers any potential bias or validity issues that arise, including imprecision of measures. You want to explain the psychological mechanisms that account for the effect you’re investigating. These mechanisms need to be causal where possible – that is, try to explain why something happened the way it did.
With your results interpreted within various theoretical frameworks (refer to the preceding section), you can really shine by describing how readers can practically test between these theories. In this way, you can suggest new and important investigations based upon theory and extra background reading. Don’t simply suggest something because you think it would be interesting: only make suggestions for work because of some important theoretical use.
You can also propose future work that may make your results more generalisable to other samples of people, or even to the real world. In other words, you can say why your results are really important and what can be done to make them revolutionise the whole world and the way people look at psychology. Don’t be afraid to be a bit pompous!
Often professional researchers use the suggestion of future research as a way of hinting that psychologists should be paid more money to do more research! Think of it as being like a job request!
Often students make general and vague criticisms of their own and published studies, such as ‘the sample size is too small’, ‘the sample involved only psychology students’ or ‘a random sample would have been better’. All these criticisms are wrong. Here’s why.
When psychologists conduct an experiment and find a statistically significant result, the sample size was the right size. End of. This is a working rule within statistics, because every participant you test costs time and money. Testing the fewest participants needed to find a significant result is best. If you don’t find a significant result, however, your experiment may lack power (the statistical quality necessary to find a significant result) and require more participants – but better ways exist of increasing power than simply testing more people.
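To see why power, rather than raw sample size, is what matters, you can estimate power by simulation. This Python sketch is purely illustrative (the effect sizes and group size are invented, and real power analyses use dedicated tools): it simulates a two-group experiment many times and counts how often a rough t-test comes out significant.

```python
import random
from statistics import mean, stdev

def estimated_power(effect_size, n_per_group, n_sims=2000, seed=1):
    """Monte Carlo estimate of power for a two-group design.

    effect_size is in standard-deviation units (Cohen's d).
    Uses |t| > 2 as a rough significance cut-off, so this is
    only an approximation, not a proper power analysis.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        control = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
        treatment = [rng.gauss(effect_size, 1.0) for _ in range(n_per_group)]
        # Welch-style standard error for the difference in means.
        se = ((stdev(control) ** 2 + stdev(treatment) ** 2)
              / n_per_group) ** 0.5
        t = (mean(treatment) - mean(control)) / se
        if abs(t) > 2.0:
            hits += 1
    return hits / n_sims

# With the same 20 participants per group, a large effect is far
# easier to detect than a small one.
print(estimated_power(0.8, 20))  # fairly high power
print(estimated_power(0.2, 20))  # very low power
```

The point of the sketch: the same sample size gives high power for a big effect and hopeless power for a small one, so “the sample was too small” is meaningless without reference to the effect being chased.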
Don’t criticise the fact that you tested only psychology students. Would they perform differently from other people on your cognitive tests? The answer is almost always no. If you have evidence from published papers that says otherwise, you can use it, but you must have a theory. If you think that psychology students may guess what an experiment is about and thereby influence the results, you have a badly designed experiment and need to improve it before running it.
Also don’t criticise the lack of random samples. Although they’re technically better than opportunity samples, achieving them is nearly impossible, as well as far too expensive and difficult to organise. As long as you describe the participants you tested, readers can determine for themselves whether your results would generalise to other populations.
Students tend to make the mistake of criticising published papers, and their own, for lacking ecological validity – that is, when the results obtained in an experiment are unlikely to replicate in the real world.
You can say that although the results may only generalise across a limited range of conditions, further work can use the experimental findings and try to replicate them in the real world. For example, if researchers investigating language find an effect in the laboratory, they can then devise a new way of helping people with dyslexia, and test this in the real world. Without the original lab work, the helpful new technique wouldn’t have been devised.