There was a time when the core experience of higher-level study involved sitting in a raked lecture theatre, listening in silence as the learned individual at the front imparted his or her knowledge to the ranks of receptive young minds. Universities in particular have had this as their modus operandi for hundreds of years, be they the elite dreaming-spired bastions of education or lowly nouveau upstarts fashioned in red brick and concrete. To some that was a golden age, when to be an academic still verged on being a calling, the goal being to ennoble one’s mind through research which, through erudition and the responsibility of lecturing, was sifted, distilled and passed on to future generations.
As a pedagogic delivery method, the lecture format – supplemented by small-group seminars for intensive examination of specific topics – remains almost unchallenged. Indeed, with every successive year of undergraduate intake it re-establishes its dominance, and with university lecturing still the only area of teaching in which it is possible to practise without a teaching qualification, many future academics are likely to fall back on their own undergraduate experiences when embarking upon careers as the guardians and disseminators of knowledge.
A succinct and not atypical defence of the lecture as a teaching method is provided by Bruce G. Charlton in his 2006 article in the journal Medical Hypotheses. Knowledge is important, he argues, but so is the lecturer as a single source of authority and information. The undergraduate student should simply absorb and be able to re-present the accepted arguments to a standard sufficient to evidence the extent of their learning; original interrogation of concepts and accepted wisdom should be reserved for advanced postgraduate study, once a thorough grounding in the methodologies of the discipline has been acquired.
The lecturer as the source of all knowledge in a discipline belongs to an age that has had its day. Learners are now in the driving seat, empowered through technology and a wide variety of social networks to shape their own unique learning experiences in a style that suits them, at a time and place that fits around their daily lives. Anyone in search of knowledge now turns first to the tools available to them – the search engine, socially developed and maintained resources and encyclopedias, and the freely available courses provided by the likes of OpenLearn, a member of the OCW Consortium. These courses are defined as “free and open digital publication of high quality university-level educational materials” which “are organized as courses, and often include course planning materials and evaluation tools as well as thematic content. OpenCourseWare are free and openly licensed, accessible to anyone, anytime via the internet.” (OCW Consortium)

The advantages of the OCW approach are self-evident: a basic structure and content path is provided by the learning materials, and a guarantee of quality is implied by dint of authorship by recognised authorities. For learners, however, the single greatest attraction is probably the opportunity to integrate learning fully into a social experience through participation in learning clubs and the like. To connect with learning in the 21st century is to connect with your peers and to build, refine and expand a collective understanding through debate and in the light of individual experience – an understanding that will in turn influence the thinking and learning methods of future generations.
This online panel discussion is your opportunity to debate the strengths and weaknesses of both traditional and open educational models. To start with, share your opinions and why you hold them. And consider: should the existence of one model always preclude the need for the other? I will moderate the discussion as it develops.