Thinking Together

This approach 'depicts children's emergent understandings as the product of the collective thinking made available to children through observation, joint activity and communication' (Mercer 2000). Wegerif et al. (2005) have shown how this pedagogical approach, set within a variety of curricular contexts, can result in the development of children's talk about thinking. Mercer et al. (1999) showed that pupils achieved better scores on tests of non-verbal reasoning (using Raven's Progressive Matrices) and in understanding of curriculum subjects compared with those in 'control' classes. Wegerif et al. (2005) applied the key-word-in-context (KWIC) analysis developed by Wegerif and Mercer (1997) to talk between students. They scrutinized verbal exchanges for utterances including '. . . because . . .', 'I agree' and 'I think . . .' Their findings suggest that teaching students to work together and engage in dialogue of a particular nature using the Thinking Together approach can improve the quality of learning interactions. Wegerif (2002) extends suggestions about facilitating thinking between peers to the use of computers. His evidence suggests that teaching thinking using software can provide a 'tutor', 'mind-tools' and support for learning conversations. He elaborates, though, that success crucially depends upon how the technology is used; much depends on the role of the teacher (Wegerif 2002). McGregor suggests that how the teacher interprets and implements any programme (see Figure 7.1) and enacts a thoughtful pedagogy (see Figure 7.2) can significantly influence the successful development of pupils' thinking.
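The original KWIC instrument is not reproduced here; as a rough illustration only, a search of a classroom transcript for such marker phrases might look like the following Python sketch. The transcript format, the marker list and the function name are assumptions for the example, not the tool Wegerif and Mercer used.

```python
import re

# Marker phrases often treated as signs of exploratory talk (illustrative list only).
MARKERS = ["because", "i agree", "i think"]


def kwic(transcript_lines, markers=MARKERS, width=30):
    """Return (speaker, marker, context) tuples for each marker occurrence.

    `transcript_lines` is assumed to be an iterable of "Speaker: utterance"
    strings -- a simplification of real transcript formats.
    """
    hits = []
    for line in transcript_lines:
        speaker, _, utterance = line.partition(":")
        lowered = utterance.lower()
        for marker in markers:
            for match in re.finditer(re.escape(marker), lowered):
                start = max(match.start() - width, 0)
                end = match.end() + width
                context = utterance[start:end].strip()
                hits.append((speaker.strip(), marker, context))
    return hits


# Example usage with a tiny invented exchange.
sample = [
    "Anna: I think the tall tower will fall because the base is too small",
    "Ben: I agree, but what if we make the base wider?",
]
for speaker, marker, context in kwic(sample):
    print(f"{speaker:5s} | {marker:8s} | ...{context}...")
```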



Conclusion


Interpretations and communications about the success of thinking programmes can sometimes, as Nickerson (1988) suggests, be 'unsubstantiated claims, one-sided assessments, and excessive promotionalism'. He reflects that authors developing and assessing their own programmes need 'more self-criticism', especially as it is somewhat 'paradoxical that some developers of programs to teach critical thinking have had less than severely critical attitudes towards their own work'. Claxton (1999) reiterates that many 'entrepreneurs' make 'grand claims for proprietary methods that are not substantiated by objective evaluation', describing how many of the published evaluations have been based on short trials with small numbers of participants, using measures of limited validity. Arguably, the positive effects of some of these 'novel' thinking approaches could be due to the 'Hawthorne effect', simply because they offer learners such a contrasting experience to their usual lessons and learning (predominantly transmission or learning-by-rote). The impact observed when studying the effects of a thinking programme that focuses more attention on the students and provides unusual, more engaging activities may not be sustained once the initial, intensive instruction for novice teachers and students ceases (Nickerson et al. 1985). Nickerson et al. (1985) list the plethora of IQ, intelligence, mental abilities, verbal and non-verbal reasoning and achievement tests, and various other puzzles and assessments, that have been (and in some cases still are) used to assess the impact of various thinking programmes. As indicated in Figures 7.1 and 7.2, however, there are many factors that influence the implementation of a thinking programme that have not been assessed, measured or even acknowledged in some studies.


Figure 7.1 Connecting influences on thinking skills programmes in schools. [The diagram links three elements: programme design and materials; teacher integration and implementation; and student participation in, engagement with and achievement through the programme.]

Notes: The efficacy of a thinking programme can be influenced by a number of factors:

  1. The design and presentation of the materials themselves and their 'fit' with the normal school policies and practices.

  2. The teacher and his or her interpretation of the materials can hugely influence the way the thinking lessons are conducted. There may be pedagogic tensions because the programme is not similar to, or easily incorporated into, the teacher's usual approach. The teacher may not receive appropriate training and support to interpret the materials as the originators intended (further elucidation in Figure 7.2).

  3. The involvement and participation of the students, coloured by the value they perceive in the thinking programme, will influence the nature and extent of their achievement.

Figure 7.2 Examples of teacher concerns and potential influences on the implementation and success of a thinking lesson/programme:

  1. Understanding the philosophy of the programme or approach.

    • Is the process more important? . . . or is the content more important?

    • How far should students understand the philosophy? There are different aims from 'content'-driven lessons.

  2. Being familiar with and understanding how the materials work (in a practical management and organizational sense).

    • Should there be demonstration, modelling, discussion or reflective collaboration on important or key points?

    • Should everyone have a copy of all the materials, or only one per group?

    • Should the materials be used selectively, differentiated or as they are?

  3. Recognizing what pedagogical approach(es) is/are appropriate.

    • Which aspects of a lesson should be explained (or given)?

    • In which aspects/parts of the lesson should a constructivist, social constructivist or sociocultural emphasis come into play?

    • When should I ask questions? What kinds of questions? When should I give the right answers? . . . or shouldn't I?

    • How large should each group be? How should I orchestrate the most effective discussion?

    • Should I apply particular ground rules/behavioural expectations?

    • How much freedom and time should they have in discussions?

    • When is it best to reflect? (Only at the end . . . or part way through?)

    • Should I provide visual/auditory and kinesthetic representation of the introduction/tasks and plenary?

    • Will my students talk to each other and disagree without malice?

    • Will my students understand 'sharing ideas', allowing each other time to explain what they think and why? Will they be mutually supportive? Do they know how to be collaborative?

  4. Considering how it may need modification and differentiation for particular students.

    • Will my students understand the terms/words used?

    • Will my students see how it helps them with . . . x and y?

    • How will my students connect the ideas to 'life' or other problem-solving situations?

    • Supporting and applying the correct thinking processes: will the way I do it mean they have 'analyzed', 'compared' and 'contrasted', 'synthesized' and 'sequenced' correctly?

    • Will it matter if I have 30 different ideas and outcomes for the same task?

    • How will I assess their thinking capability?

  5. Reflecting on the success of the lesson/programme (both the thinking and the learning).

    • What needs improving next time?

    • How will I improve it next time?

    • Is what I am doing good enough to make a difference?

    • Who can help me develop better thinking in my lessons?
Many schools have engaged with brain-based theories of learning. Accelerative learning is also called 'accelerated learning' (not to be confused with the interventional Cognitive Acceleration), 'whole-brain learning' and 'brain-compatible learning', and is perhaps more holistically perceived as a way of provoking the neuro-anatomical brain and engaging the mind to learn through multi-modal strategies that use both hemispheres of the cerebral cortex. The emphasis is on reducing stress so that the learner feels sufficiently relaxed for the blood to nourish all parts of the brain and physiologically enable the body and mind to learn most effectively. It is usually understood to be a battery of techniques (Noble and Poynting 1998) designed to create a relaxed but motivational learning environment, in which students' individual learning styles and different parts of the brain are catered for through an array of strategies involving stimulation of the senses, memory aids, changes of activity, positive affirmations, and so forth. These techniques are often sold as 'based on research' (Smith 1998: 28; Smith and Call 2000: 26–7).

However, Bruer (1997) suggests that there are overgeneralizations (and misconceptions) about what we know about the brain and how that can inform educational practices, and he cautions against the development of brain-based education. He describes how it consists of rather basic but quite dated results from cognitive science and experimental psychology mixed in with really bad brain science. He describes how the synaptic development of children up to the age of 10 is more rapid than at any other time. Early childhood experiences fine-tune the brain's neural connections: those that are unused wither and those that are used repeatedly are maintained. Thus, by providing rich, stimulating environments at this time, children can develop more dendritic connections, and it appears that a lack of stimuli may result in a potential loss of learning opportunities. Bruer suggests that we should remain sceptical about brain-based educational practice, and look more carefully at what behavioural science can convey about teaching, learning and cognitive development.

There is much evidence that a focus on thinking can improve students' academic performance, but the varied nature of the evidence does not enable easy analysis of 'what' exactly influences improvements in thinking and learning performance. The 'what' could be the varied stimuli received through the programme: its design, the nature of the thinking tasks, the progression from simple to more complex activities, and the regularized intervention. It could also be the change in teacher emphasis: developing more sophisticated methods of questioning, developing more cognitive challenge, modifying expectations of students' cognitive engagement, refining learning objectives to include the procedural aspects of thinking as well as the conceptual understanding achieved, changing the social interactivity between students, valuing dialogue more, recognizing success through a wider range of media, encouraging more bridging from specific contexts to everyday life . . . and transferring these valued pedagogic shifts to other 'non-thinking' lessons too.

A wider acceptance of educational research that provides rich qualitative evidence supporting quantitative findings could provide valuable insights into what works and how. Systematic reviews tend to rely on established or intelligible tests of some kind, which immediately assess some cognitive skills and not others. Reviewers also tend to assess carefully the structure of the programme(s), but not always how it is implemented by practitioners in their schools and classrooms. The outstanding issues of curricular design, the nature of pedagogy, the culture of classrooms, and teachers' beliefs about learning, as well as the nature and impact of their professional development, all require further clarity, investigation and connection with thoughtful learning. What is more obvious, however, are the common aspects of successful thoughtful pedagogies, which include more open approaches to the development of, and reflection on, thinking processes. Valuing talk about thinking; offering challenges or tasks that involve collaborative participation; and involving questioning, predicting, rationalizing, recognizing alternative perspectives and certainly being metacognitive (McGuinness 2005) are all features of classrooms where thinking is actively and successfully pursued.



Slavin (2004) highlights how studies exploring what works, and how, require careful design to provide evaluative evidence that is reliable and well founded enough to enable practitioners and policy makers to make informed, discerning decisions. Thoughtful experimental design is desirable so that justifiable evidence can be generalized and presented for educators to make informed decisions about curricular and pedagogical issues. As Bassey (1999) claims, we need more 'fuzzy generalizations' from case studies of a more qualitative and illuminatory nature. Studies of the impact of the nature of pedagogy on thinking are emerging (Dawes et al. 2000; McGregor and Gunter 2001a, 2006; Leat and Higgins 2002; Adey 2004; McGuinness 2005), but more insights are needed to further illuminate how and why some programmes appear to be more effective than others.

