Listen-Read-Listen

Technique: Listen-Read-Listen

Here’s a technique that works well if you’re intentionally trying to develop aural comprehension skills. You need an audio source with text, so generally either (a) a recording that has an accompanying transcript (I use this for Gaelic with a 5-minute podcast that comes with a transcript), or (b) a text for which a good audio recording has been made (for Latin there are quite a few good recordings of poems/letters/etc. that come to mind).

Step 1: Attentively listen to the audio.

Your goal here is just to understand as much as you can. If it’s totally incomprehensible then there’s probably some factor making it too difficult (accent? text? you’re not ready for this particular text?). You’re not trying to recall everything, and definitely don’t try and transcribe it (a different skill and a different task).

Step 2: Read the text

Now it’s time to pull out the text. Depending on your level and the text’s, this might be extensive reading, or it might be intensive. Reading will help make sense of what you heard. My suspicion is that the initial listening doesn’t contribute very much to how well you comprehend the reading, but the reading contributes a great deal to the subsequent listening.

Step 3: Back to the audio.

So now you go back and listen attentively to the audio. You should understand a lot more this time! There’ll be things you can more accurately ‘pick out’ and recognise, and overall your comprehension will improve. You probably won’t understand everything, and you will feel like there are things you just read that you can’t quite remember while listening. Don’t stress, just listen and seek to understand.

 

And that’s it! Simple, effective, a good way to use audio but leverage it with written material.

Reading to Learn v Learning to Read

Recently I was reading a document about extensive reading and it highlighted the difference between intensive and extensive reading in the above terms (reading to learn, learning to read).

Intensive reading is reading ‘above’ our level, or sometimes below our level but with a lot of analysis thrown in. This is “learning to read”. It’s when we encounter a whole lot of ‘unknowns’ – unknown words, concepts, structures – and we need to do “work” to make a text comprehensible. It’s slow, and because the amount of “unknowns” is so high, we are not really reading. We are learning to read. We are using a bunch of tools-that-aren’t-reading in order to make reading possible.

Which is fine, there’s a place for this. Unfortunately this is almost everything that students of historical languages (looking at you, Greek and Latin) do. They read texts that are far, far too difficult for them, and they agonisingly pull them apart until they understand the meaning. And then they move on.

Extensive reading is reading that is at, or even ‘below’ our level. It’s when you read for the sake of the message being transmitted by the text, you operate mentally in the language of the text, you don’t stop to analyse the text per se, though perhaps you might linger to savour the text! You can read a lot faster at this level, and you’re not looking up unknowns, except maybe a very occasional one that you kind of thought you knew the meaning of, but wanted to check or were just interested.

This is reading to learn. The skill being practised is reading, and the object is learning, not vice versa. This is what is missing from most language students’ practice. And this is what’s particularly hard for classics and biblical studies students. There simply isn’t enough material at an easy enough level to do “a lot” of reading. The situation is better for Latin than for any other classical/historical language, but such material is still difficult to obtain. For Greek, it’s a nightmare. I’m working on a little side-project to help with that (btw).

So, if you want to improve as a reader, or a language learner in general, you almost certainly need a lot more extensive reading.

(The document I was reading is the Extensive Reading Foundation’s Guide to Extensive Reading, see here;

For a great presentation of this applied to Latin, see Justin Slocum Bailey here (31 min video))

A much shorter presentation of the case for Extensive Reading, again by JSB, here (6 min video)

Know/Don’t Know: the myth of binary knowledge in language learning.

The other day I was in a conversation and couldn’t for the life of me retrieve the Gàidhlig word for “question”. All I could think of was freagairt, which is “answer”. I had to ask what it was. It’s ceist, of course. Duh. That’s a word I “know”, or “am meant to know.”

But the real question is never ‘do you know this “word/phrase/structure/chunk of language”?’ It’s always, ‘can you comprehend this chunk of language right at this instant, or produce this chunk in a way that effects communication?’

Which means the strongly binary model of language learning most of us inherit – “Teacher taught word X, therefore student learnt word X” (wrong not just for languages, but for instruction in general), “You memorised word X, therefore you know word X in all circumstances”, or even “you once got X right on a multiple-choice question, therefore you can actively recall X for communicative production”, and so on – is just wrong.

‘Knowing’ is a lot fuzzier. It’s a huge range of contextualised, circumstantial, bits and pieces that determine whether communication is going to take place in any particular instance, and how well a message is going to go from producer to receiver.

Which is why, at the end of the day, “vocab testing” is mere approximation. It’s testing, “can you on particular occasion X, recall particular word Y (actively? passively?) in particular context/decontext Z which may or may not bear much relation to any genuine language encounter?”

It’s also why we should basically ‘lighten up’ on students. “I taught you this” has no real place in a language teacher’s teaching vocabulary (except maybe as a joke?). Students don’t really need to feel shame/guilt/frustration at not knowing a chunk of language in that moment, they just need the minimum amount of help to make the utterance comprehensible, so they can get on with getting meaning and so acquiring language. And the next time they encounter, or need, “chunk X”, it will hopefully come a little easier. Or the next time. Or the time after that. Or however many times.

Don’t use “means” when you mean “translates as”

I’m trying to cultivate a new habit, and the title is it. Every time I find myself writing something like, “the word ὑπόστασις means ‘blah de blah blah’”, I stop and rewrite it as something more like, “the word ὑπόστασις translates as ‘blah’ or ‘blech’”.

The reason is that ‘means’ in these cases tends to perpetuate an implicit approach to language that treats it as mere code or cipher, as if other languages really encode ‘meaning’ that is genuine in English. Which is patently false. ὑπόστασις doesn’t mean “subsistence” or “person” or “being”.

On this issue I’m not trying to be some kind of hardline “no, you can never say X in one language ‘means’ Y in another”, but I do think it would serve our writing better to avoid the construction because of its implicit connotations.

This is particularly a problem with Biblical Exegetes and their tendency to say, “Ah, yes, the Greek word ‘means’…English.” Let’s at least start killing that.

Semi-regular rant on Greek language pedagogy

(I’m mostly in the midst of doing a lot of thesis writing, but thought I could take some time out to ride a hobby horse).

  • Knowing a language isn’t a qualification for teaching a language.

We usually think that knowing something is a pre-requisite for teaching it, and generally that’s true. But it’s a necessary condition, not a sufficient one. Plenty of people know skills or competencies which they do not have the ability to teach very well. This is why teachers get trained: so they know (a) how to teach, as well as (b) the material they will teach.

Why would you think a language was any different? Monoglot Anglophones are particularly susceptible to this delusion: “Oh, you know Spanish, teach so-and-so.” If you’re a monoglot L1 English speaker, have you tried to teach English? It’s not that easy.

Why then do we think that merely being a successful student of Greek or Latin or X-language turns one into a qualified teacher of the same?

  • Having a PhD in Greek linguistics or in New Testament studies indicates almost nothing about how well you can teach Greek.

Most seminaries use their New Testament faculty to teach Greek, on the theory that they’ve studied a lot of Greek and did PhDs involving Greek. But following on from point 1, this is only incidentally related to knowing how to teach Greek. It guarantees that the methodologies used in seminary-based Greek education will continue to passively reproduce ‘the way I was taught’ from generation to generation. Which is not best practice in the field at all.

  • Knowing a language and knowing about a language are two fundamentally separate things.

Anyone who gets to the end of a grammar-translation based program ought to realise this. Knowing about a language – whether in the terminology of (traditional) grammars or in the jargon of the discipline of linguistics – is not the same as possessing a communicative ability in the language: the ability to read/write/listen/speak directly in the language. They are two separate things, and they are acquired separately. Most speakers of an L1 do not develop any significant ability to talk about the grammar of their own language unless taught it explicitly and formally. Students whose primary educational content is a grammatical description of their target language should end up with an ability to analyse and interpret it, but any genuine acquisition of the language is incidental, and sometimes accidental.

  • Pursuing acquisition doesn’t mean surrendering analysis.

One of the arguments I most commonly hear against communicative-based approaches to language acquisition for languages such as Greek is that it means students will not learn to do the kind of linguistic analysis that is currently taught. That would only be true if a program were designed exclusively to provide language acquisition and deliberately avoided any meta-language discussion. There is no intrinsic reason why students could not be taught meta-language skills in addition to actual language acquisition. Nor, if we are honest, would it be that problematic or time-consuming to teach them to do so.

  • Pursuing acquisition doesn’t mean “too long, too slow, too little.”

Another objection I commonly hear is that while communicative-based approaches may be possible, they would take too long to reach their destination – time which programs and students don’t have. To which I have several replies. Firstly, this is largely untested for classical languages: there are so few programs running full-blown communicative-based pedagogies that evaluating whether it actually takes too long is not seriously possible. Assuming that it would is bad research methodology. Secondly, I suspect this is not a concern at the level of language pedagogy, but at the level of seminary curriculum design. If students and programs don’t have time to actually teach Greek as a language, that’s a decision about what’s important for seminary graduates, and a wrong one in my view.

  • There is a point to pursuing acquisition.

The third common objection that I hear, and feel like rambling about today, is that there is simply no point or value in developing a communicative ability in Greek. Honestly, I find this baffling. I would never say that someone whose English corpus was limited to 20,000 Leagues Under the Sea, and whose ability to understand it was limited to sentence diagramming and word-by-word glossing, was someone who ‘knew English’ and could reliably understand English-language texts. For every modern language we expect Acquisition, not Grammar-Knowledge. Ancient languages are not categorically different.

  • We do ourselves and our students a disservice by perpetuating Grammar-Translation

The overwhelming consensus in Second Language Acquisition theory and applied linguistics is that G-T is a poor method, and it produces sub-standard results. It’s not best-practice, and we’re kidding ourselves if we think that it is. Continuing to teach generations of students Greek, Latin, insert-other-ancient-language-here via Grammar-Translation, when collectively we know better, is a dishonesty, and the cognitive dissonance should cause us mental discomfort. Demand something better from yourself and for your students.

“It depends…” – Some thoughts on translation

As I mature as both a reader of ancient texts and a teacher, I find myself saying a lot more, “It depends…”, as well as “You could translate it that way”, “That’s one way to render it”, and a lot of “it depends on the target audience of your translation.”

 

Most students who study Greek or Latin in a traditional program are taught to translate, and to translate in order to show their knowledge of the underlying grammar. That is, the goal of their translation is not ‘word for word accuracy’ or ‘literal rendering’, but ‘demonstration of grammatical knowledge in the target language’. Later on, they are told, you can use freer translation, but for the beginning stages we want to see that you know grammar like we know grammar.

Which, from that school’s philosophy, makes perfect sense. But we all know (we do all know) that translation is an intricate art and is always a betrayal. Translation isn’t even simply a spectrum from ‘literal’ to ‘dynamic’ with some super-holy synthesis in the middle where the HCSB lives.

When we translate we are trying to convey something of the base text to something of the translated text. Usually that is ‘meaning’. But even meaning is a bit nebulous – do we want to preserve the meaning of words, or of phrases, or of clauses, or the gist of the whole passage, or sometimes the socio-communicative function of the text? It’s never simple.

Likewise, our translation can be familiarised: we can try to render elements of the socio-cultural context of the base text into immediately understandable analogues in the target language’s culture (for example, what do you do with the Good Shepherd in a culture that doesn’t have, and never has had, sheep?). Or your translation can be alienised: preserving idioms and cultural references that won’t be immediately understandable to the reader, and that will require them to acquire new information about the base text’s culture. Or it can even be defamiliarised: taking elements of a text that are comfortable and familiar, say in an already existing translation version, and rendering them in a way that is jarring and dislocating, so as to force the reader into a new act of reading.

Personally, the way I try to train readers of ancient texts is to focus as much as possible on getting them to read what is right in front of them. Read the text “as it is”. When I bring this over to translation, my philosophy is “best represent the text as you can” – if it has ambiguities, try to render them ambiguously; if it has clarity, express that clarity; if it has foreignness, preserve the foreignness. I think of this as fidelity in translation, but I recognise that there are other ways to do that, and that even a single translator (myself!) translates differently for different contexts and purposes.

Poetry is a great place to test a translation philosophy. If you accept Jakobson’s functions of language, and even some modicum of structuralism, then poetry is a form of language in which the focus is on the code itself: the language used to mean is the focus. Poetry is language highlighting language (but not language talking about language – that’s the metalinguistic function!). Anyway, what do you translate in poetry? If you focus on meaning, you lose the poetics, but if you focus on poetics, you must betray the meaning! And even if you focus on poetics, you still face difficult choices.

 

Say we’re translating classical Greek poetry into English. Do we choose an English verse form? Free verse? Alliteration? Metre? Even if you choose metre, you’re doing a ‘disservice’, since Greek metre is quantitative but English metre is stress-based. Yet translating this way also creates, and indeed has created, an English metrical tradition. The further back you go, the more you see an alliterative tradition in English. But if you translate into a contemporary poetic medium, you might end up with free verse. And whatever you do, you are in fact creating as well as translating, and inevitably betraying. One could focus on meaning instead, but then you will betray the poetic function of the text. You have no choice but to fail! And yet translations succeed. That is the amazing thing about translation: it’s actually possible.

 

What are your thoughts? How do you feel about translation and translation-philosophies?

Language as problem-solving

One way to look at language as a phenomenon is to realise that it solves problems.

That ‘problem’ is communication, and so language solves a set of problems that range from the most simplistic (I want you to give me a rock), to the incredibly complex (I want you to have the same knowledge of Hindu ontologies that I do). But whatever it is, it’s a problem for which language provides a tool to solve.

If you think of language this way, it also helps make sense of why languages are so similar – they have to solve the same set of problem-types. There might be infinite ‘problems’, i.e. infinite purposes to which you can apply language, but they can ultimately be typed as a finite set of purposes or problems. And so we can classify language patterns across languages by what they solve.

For example, things that generally fall under the ‘Imperative’ label solve a fairly basic problem: how do I tell you I want you to do something? I use something that functions as an imperative.

This is a good example to talk about. Let’s say I’m sitting at dinner with you and I want you to pass the salt.

I might say, “Pass the salt”. We would call this, grammatically, an imperative. But English linguistic cultures generally don’t like plain imperatives.

We might instead say, “Please, pass the salt.” I don’t know what grammarians call ‘please’; we could analyse it historically and etymologically, and talk about, ‘if it please you’; or we could talk about it functionally and say that it softens the directness of the imperative.

Or we might say, “Would you please mind passing the salt”, in which case we have added a verb (“mind”) and a modal phrase (“would”) to make the imperative more indirect. Grammatically most people would not parse this out as an imperative. But it’s functionally an imperative.

Now, when you come to a different language, you can ask yourself this question, “How do I get someone to pass me the salt?”

Do you see how the problem gives you an acquisition tool? Or to rephrase, thinking of language like this allows you to set up situations to acquire language. Want to learn the imperative? Create situations where you can elicit it and where you can use it and be corrected.

Mongolian is an example close to my experience, and a good counter-example. It’s useless to translate something like, “Could you pass me the salt?” You get something like Чи давс надад өнгөрөж болох уу? – which is not even a proper sentence, because you can’t use ‘pass’ in that way. Nor can you use something indirect like, “I would like the salt.” Even “I want to get the salt”, Би давс авмаар байна, is not appropriate. In Mongolian this is an indicative statement that expresses the fact that you want the salt. There is no socio-linguistic convention that makes it an implied imperative. You are simply telling the other party about your volitional state.

To get the salt, you need to say something like давс аваад огооч, which translates as something like “take the salt and give [it]”. The combination of “take, then give” is the proper expression for the English “pass”, and the -ооч ending is one form of the imperative.

This is on the simple side of examples, but I’m trying to illustrate that thinking of language as a set of ways to solve problems allows you to:

1. Often ask the question, “How does my language solve problem X?”, instead of “How do I translate X?”

2. Set up and look for situations that elicit certain structures, by thinking through what problems there are.

Triple language overload

The Patrologist has been quiet lately because he’s in Mongolia teaching Ephesians in Greek. It’s really a test of one’s linguistic competencies to explain the intricacies of Pauline grammar and theology in a third language while your notes are in your first. But it doesn’t leave much time for blogging. Some semi-regular thoughts on all things Patrology will resume the following week. In the meantime, let me just say what’s up with the crazy thought process behind Ephesians 4:16! That sentence is all over the place.

How do you get an ancient language ‘right’?

You know, at the end of last week I put up that YouTube video with an invitation to come and learn some Latin with me through CKI. I knew before I posted it that there would be errors, because I know that anytime I speak in a non-native language I generate errors, and it is impossible for me to catch them all. Indeed, I even make errors in my native tongue, so it’s not really a surprise.

 

I could launch into a defence, self-deprecating or self-justifying, of the kinds of errors I made in that video, or why, but I don’t really think that’s helpful. If you want to learn Latin through CKI, go and enrol. If not, don’t. If you have specific questions about my Latin or my credentials, you can write to me.

No, I want to talk about something else today, and that’s the issue of how we know whether we’re getting an ancient language ‘right’. If communicative Greek, Latin, or any other now-‘classical’ language is going to be taught along communicative lines, with an aim of “accurate speech that reflects the linguistic norms of a certain time period”, how do we go about that?

I was asked this a while back in person, and then I gave a double-barrelled answer, but I think one of those barrels can be elaborated on.

So, we don’t have a living, breathing, talking language community to norm our own speech behaviours. Therefore we must use other means:

  1. Explicit Grammar Check and Correct
  2. The Textual Corpus

Let’s talk about 1 first. We do have a pretty good grammatical knowledge of Latin (let’s stick to Latin in this post). So we can pre-check and post-check things we say/write. Pre-checking is what I try to do before a teaching session – I look at what I want to teach, and I double-check forms, patterns, usages, even vowel lengths and accents. I’m trying to make sure in advance that what I present is right. Post-checking is what happens afterwards. Many things might arise in the course of using the language: some of which I think I know, some of which I half-know, some of which I’m less sure of. All of these I want to review later. It’s almost certain I make mistakes. Explicit checking allows us to go away, mark what wasn’t correct, and then correct it the next time around.

The second element is that we have a large textual corpus. It must stand in for a speaking community. But we can use it in several ways. One way is simply to be reading as much authentic material as we can manage. This exposes us to natural patterns of usage that we wouldn’t think of ourselves, and ingrains in us the phrases, idioms, structures of the language at both a grammatical and discourse level.

We can also use the corpus more explicitly. We can run analyses on words, forms, phrases, etc., to try to work out actively how some things might be expressed. This I do not really do enough of – it’s not really my area – but I still think it’s important.
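To make this concrete, here is a minimal sketch of one kind of explicit corpus query: counting how often candidate phrasings are actually attested, so you can check your own usage against the corpus rather than against intuition. Everything here is illustrative – the corpus filename and the example query are hypothetical, and a serious workflow would normalise orthography (u/v, i/j) and lemmatise first.

```python
import re
from collections import Counter

def load_tokens(path):
    """Lowercased word tokens from a plain-text corpus file (ASCII letters only)."""
    with open(path, encoding="utf-8") as f:
        return re.findall(r"[a-zA-Z]+", f.read().lower())

def bigram_counts(tokens):
    """Frequency of each adjacent word pair in the corpus."""
    return Counter(zip(tokens, tokens[1:]))

def compare_phrases(tokens, phrases):
    """For candidate two-word phrasings, report how often each is attested."""
    counts = bigram_counts(tokens)
    return {p: counts[tuple(p.split())] for p in phrases}

# Hypothetical usage: which phrasing does my corpus actually attest?
# tokens = load_tokens("corpus_latin.txt")   # placeholder filename
# print(compare_phrases(tokens, ["opus est", "opus sunt"]))
```

The same pattern extends to trigrams or to counting which forms follow a given word – crude, but enough to turn “I think this is how it’s said” into “the corpus attests this N times”.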

We cannot become native Latinists, native Koine speakers, and in fact even if we achieve vaunted ‘fluency’ there is always more to be learnt, more mastery to be acquired. Learning to norm our own speech according to an objective norm is what helps us stay on track.

Figurative Language

I’ve been trying to figure out how to write about this for a while, and I don’t think I’ve got it worked out yet.

 

Firstly, what’s the gap between literal and figurative language? Literal language is generally thought of as ‘according to the proper meaning of a word’. Figurative language, then, is when words are applied not according to their ‘proper’ meaning.

Unfortunately words cross their ‘proper bounds’ all the time, and the figurative can become literal. Is referring to the ‘leg’ of a table a proper or a figurative usage? Was it perhaps once figurative but its usage has now become literal, so that within the bounds of this lexeme’s semantic domain one must include ‘one of several generally-vertical portions of a table used to elevate the horizontal portion off the ground’?

When it comes to the Bible, nobody reads literally. And calling oneself a ‘literalist’ is itself a figurative use of the word, because it usually means ‘someone who holds to a higher concept of inspiration/authority/etc. than that group of Others’. The attempt to distinguish between ‘literal’ and ‘literalistic’, while sometimes helpful, is itself this kind of exercise in Othering. Those who want to frame themselves as ‘literal’ refer to the genre of a text, the intention of an author, the ‘normal’ ways of meaning applied in language; i.e., they recognise that the Scriptures contain figurative usages.

Our real points of contention come when we try to work out where figurative language ‘starts’, where it ‘ends’, and what counts as ‘too far’. Does figurative interpretation have ‘proper limits’? This is the concern with ‘allegory’.

What is allegory? Typically, allegory is understood to be a figure of speech involving a sustained or extended metaphor (itself not a very helpful word, “a word or phrase applied to an object or action to which it does not normally refer, to create an implied likeness or comparison”) in which multiple points of comparison are made, transforming the original text into a highly symbolic or coded text.

Rather than giving you an example of allegorical interpretation of Scripture, I want to illustrate with an allegory used within Scripture:

Judges 9:8-15

“The trees were determined to go out and choose a king for themselves. They said to the olive tree, ‘Be our king!’  But the olive tree said to them, ‘I am not going to stop producing my oil, which is used to honor gods and men, just to sway above the other trees!’ “So the trees said to the fig tree, ‘You come and be our king!’ But the fig tree said to them, ‘I am not going to stop producing my sweet figs, my excellent fruit, just to sway above the other trees!’ “So the trees said to the grapevine, ‘You come and be our king!’ But the grapevine said to them, ‘I am not going to stop producing my wine, which makes gods and men so happy, just to sway above the other trees!’ “So all the trees said to the thornbush, ‘You come and be our king!’ The thornbush said to the trees, ‘If you really want to choose me as your king, then come along, find safety under my branches! Otherwise may fire blaze from the thornbush and consume the cedars of Lebanon!’ (NET Bible)

Actually, you probably need to read all of vv. 1-20 to get the proper context, but this is an allegory that is easily understood. It’s an extended metaphor with multiple points of comparison, giving a vivid symbolic narrative. The allegory is controlled by the author – he gives us the points of comparison and the limits of the allegory itself, in vv. 16-20.

The problem, generally, with allegorical interpretation of Scripture is that it is uncontrolled because the text and its author have no scope for limiting the points of comparison or analogy.

Those who rail in favour of ‘literalism’ often do so against allegorical interpretation and especially its excesses and caricatures. But attacking the extremes of figural language is no more convincing than attacking literalism by constructing a straw man of ‘literalists’ who refuse to accept even the most common figures of speech.

I think we need to do ‘better’ in this area. We need a more thorough-going thoughtfulness about how figurative language works, and how it is bounded. I hope to keep exploring this in some future posts, which I will get to whenever I get to them. But feel free to add some thoughts and questions in the comments.

How many languages do you speak/know/etc? Acquaintance vs. Competency vs. Proficiency vs. Fluency

It’s not infrequently that I get asked a version of the first question. It happened a week or so ago: a girl in my research office casually started a conversation with, “So, how many languages do you speak?”

My reaction varies a little based on the question, or more precisely the verb chosen in the question. Generally speaking, I don’t like to launch into a discourse on the difference between knowing/speaking/acquisition/etc.. That’s what this place is for!

I think one of the difficulties is that we have a Native Language concept that interferes with or influences how we think about L2s. That is, we generally think we ‘know’ our L1(s), and treat them as a singular entity that is “known”. Even though, of course, native speakers often don’t ‘know’ some things about their language, or make errors, or any number of things. Let’s not even get into the idea of idiolects and each language as an idealisation. We tend to think of Languages as Idealised Units, and knowledge as binary: you take a course of instruction and then you “know” the TL. Even when we know this is wrong, we still have this tendency. We, ironically, need better vocabularies for talking about knowing languages!

I like the term “Acquaintance” to mark any general knowledge about a language and an ability up through the beginner stages. It’s a useful tag for saying, “I’ve come into contact with TL (target Language), know a little bit about it and know a little bit of it.” It’s vague enough and humble enough to cover a wide range of levels below the rest. I would say that I’m acquainted with German, French, Italian, Spanish, and ‘well-acquainted’ with Biblical Hebrew.

Let’s skip forward to “fluency”. This is probably the most difficult term, because it’s used for such a broad range of abilities. Benny Lewis of Fi3M fame pins it as low as B2 (in his Fluent in 3 Months, Kindle loc. 674). I suspect most people think of it as higher than this, C1 at least. For most people, fluent means something close to native-like: an ability to speak about a broad range of topics, in depth, without any errors that hinder communication. It’s, frankly, difficult to reach such a level in an L2, primarily because the sheer time needed to go from B2 to C1, and then to C2, is really quite vast, and requires a lot of time functioning in the L2. I rarely say I’m fluent in a language (except English!)

What’s between the two? I like to use the terms “competent” and “proficient”. Recently I’ve been reading Alice Omaggio’s Teaching Language in Context: Proficiency-Oriented Instruction, which is an interesting book for all sorts of reasons. Although these terms could be used interchangeably, or with various nuances, I treat ‘competent’ as lower than ‘proficient’: “competent” in my mind is someone who can understand the language without frequent miscommunication, and can manipulate the language to express what they want. It’s a lower-intermediate level. “Proficient” in my view is more what Benny thinks of as “fluent”: the speaker has a mastery of competency, so to speak. They are able to discourse about a range of topics, and have socio-communicative strategies for managing when they are out of their depth. They are not near-native, but they can function and survive in a wide array of TL settings, without needing to resort to their L1s.

Generally I would say I am competent in Gaelic. I would say I (was) proficient in Mongolian, though it’s probably slipping. Depending on the audience, I am usually happy to say I’m proficient in Ancient Greek and Latin, primarily because, although my conversational skills are low, I am rarely called upon to speak these languages, and my high-level reading abilities mean I am equipped for what I do in them.

Of course, in an ideal language-learning world we would have unlocked all secrets and have a fast-track method for moving students rapidly from A1 to C2, from acquaintance to ‘fluency’. But we don’t; we just make up these labels to try and categorise a range of phenomena: our raw data on those times when we succeed or fail in communication, whether transmitting or receiving. What’s more important, in my view, is the simple principle of language-learning momentum: to keep moving forward rather than backward in one’s L2s.

 

Sometimes I just say 5 and move along with my day.

Software for ‘reading’ foreign language texts (2)

Okay, Let’s look at Learning with Texts, and how to get set up and going with it.

A lot of the information in this post is derived from this post over at DIY Classics, which I 100% encourage you to go and read. My aim is to supplement it by giving some more specific details and talking through how to use the two programs.

Your first port of call should be the LWT page, which is full of information, including what to do if you want to set up your own localised version, which involves running a server. That’s not most of us, so thankfully Benny of “Fluent in 3 Months” fame runs a free hosted version. So next stop is to go to his website:

http://www.fluentin3months.com/wp-login.php?action=register

and register a username and password, then use this to go to:

http://lwtfi3m.co/

and use it to log in.

For all the set-up details, follow the rest of that post at DIY Classics (though note point 5 below); here is the link again. But if you are feeling lazy and want to stay here, here’s a brief run-down:

  1. Click on my Languages
  2. Click on the Green plus sign or ‘New Language’ and add Latin/Greek
  3. For dictionary, you want to add:

Latin: http://www.perseus.tufts.edu/hopper/morph?la=la&l=###

Greek: http://www.perseus.tufts.edu/hopper/morph?la=gr&l=###

I would delete the Google Translate URI because it doesn’t exist for Ancient Greek, and isn’t good for Latin.
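As an aside, the ### in those dictionary URLs is the placeholder that gets filled in with whatever word you click on. A minimal Python sketch of that substitution, assuming a straight URL-encoded replacement (the function name is mine, not LWT’s):

```python
from urllib.parse import quote

def dictionary_url(template: str, word: str) -> str:
    # LWT-style lookup: replace the ### placeholder with the
    # URL-encoded form of the clicked word.
    return template.replace("###", quote(word))

latin = "http://www.perseus.tufts.edu/hopper/morph?la=la&l=###"
print(dictionary_url(latin, "amo"))
# → http://www.perseus.tufts.edu/hopper/morph?la=la&l=amo
```

The same template works for Greek, since `quote` percent-encodes the polytonic characters for the URL.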

  4. For Character Substitutions: ´=’|`=’|’=’|’=’|…=…|..=‥|«=|»=
  5. Especially for Greek, you want:

RegExp Split Sentences:  .!?:;•

RegExp Word Characters: a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ

These two values make sure that LWT correctly works out (a) what starts and ends words, and (b) that it uses the Unicode ranges that include polytonic Greek characters. Both are important in getting Greek to display and function properly. Notice that the RegExp Word Characters value is different from what DIY Classics gives; I found theirs didn’t work for me.
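If you want to sanity-check that character class outside LWT, a quick Python test (variable names are mine) shows it picks up polytonic Greek:

```python
import re

# The same ranges as the RegExp Word Characters value above: Latin
# letters with diacritics, the basic Greek block (Ͱ-Ͽ), and the
# polytonic Greek Extended block (ἀ-ῼ).
word_re = re.compile(r"[a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ]+")

print(word_re.findall("Παῦλος ἀπόστολος Χριστοῦ Ἰησοῦ."))
# → ['Παῦλος', 'ἀπόστολος', 'Χριστοῦ', 'Ἰησοῦ']
```

Punctuation falls outside the class, so it correctly delimits words.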

  6. Select your Language from the home screen
  7. Click “My Texts”
  8. Click “New Text”, and enter a title, paste in your text, and tag it as you please.

Okay, you should be all set up for Greek and/or Latin. Once you’ve done that, go to “Read”; it’s the first icon listed on the text page. You’ll get a screen with four components:

LWT 1

In the top left is your LWT menu. Notice that it lists 839 “To Do”: those are untagged/unknown words in the text. Next to it is an “I know all” button; basically, click this if you know every single word in a text and can’t be bothered tagging them.

Below this is the reading pane. In this is the text you’re working with. You can see in the first screen shot that I’ve left-clicked on ἀπόστολος, which has given me several options.

In the top right pane I’ve got the option to edit this term; it’s listed as a new term, and I can add in both a translation and some tags: perhaps I would add noun, masculine, nominative, singular. “Romaniz.” is for a romanisation of foreign alphabets. At the bottom is a coloured status bar: 1–5 from unknown to well-known, then “WKn” for words so well known you don’t want to worry about them ever again, and “Ign” for ignoring a term, useful if your text contains non-words or words not in your target language.

The bottom right pane that’s opened up is the dictionary look-up, based on what you listed in Dictionary 1 in the settings. In this case, it’s gone to Perseus just like I told it to. The bottom frame works like an inset webpage, so you can click through to the LSJ or Middle Liddell entry as you please.

 

Foreign Language Text Reader

I also want to talk about Foreign Language Text Reader (FLTR). FLTR is like a slimmer, maybe-dumber, version of LWT. But its two greatest strengths are that it’s on your computer (without running a server!) and that it’s super simple.

For FLTR head to https://sourceforge.net/projects/fltr/

Follow the download and installation instructions; they are pretty straightforward.

Open up the program and you’ll see a very basic interface.

First you’ll want to find the line of options that starts with Language, click on New, type in your language, say “Greek”, and then we want to edit the settings for the language.

The settings are pretty much the same as for LWT, so here’s what I use for Greek:

Char Substitutions: ´=’|`=’|’=’|’=’|…=…|..=‥|«=|»=

WordCharRegExp: a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ

DictionaryURL1: http://www.perseus.tufts.edu/hopper/morph?la=gr&l=###

You may want to come back and increase the text size later as well. I find the default is often too small.

Then you need to add some texts. Personally, I add them via this method:

Each language has a subfolder wherever you installed FLTR. Open this up and you’ll find a folder like Greek_Texts; save a text file (*.txt) in UTF-8 format in here, and you can use it in FLTR. So, for example, grab a copy of, say, 1 Peter, put it into a text editor, switch to UTF-8, and save it there. Open up FLTR, select Greek, and then select the text. If it hasn’t appeared, click refresh and double-check that it’s in the right folder. Here’s a screenshot of some Greek text open in FLTR:

FLTR Greek

You can see that I haven’t used it much with Greek, as most of the text is blue, which marks new words you haven’t tagged. Green words are well known; yellow words are level 4, pretty well known; the colours shade down to red for levels 1–2, not very well known. A left-click will bring up two things:

FLTR Greek 2

Firstly it will bring up the editing screen for that word, which you can fill in with relevant information; it will also open the linked dictionary in a web browser. A right-click on a word brings up a smaller dialogue box, and I can edit from there as well, without it forcing open the web-browser/dictionary. Here’s another screenshot, with a short text about Bonnie Prince Charlie in Gaelic; you can see I’ve worked both with Gaelic and with this text before, because most of the text is coloured in.

FLTR Greek 3

What’s the point? Or, Pros and Cons

A student helpfully asked me how this is better than/different to using, say, morphologically tagged Bible software, à la Accordance, BibleWorks, Logos, etc.

  • It’s personalised, and testable. Every entry is put in by you, and so it’s filled with whatever you wanted to include.
  • It’s geared towards reading and familiarity. It doesn’t mindlessly tell me all the information for each word as I scroll my cursor across, it colour-codes to how well I know the word, and what information I have included.
  • It’s faster than doing this manually. Reader’s editions are great, writing on paper is great. This lets you tag your own texts digitally, and it saves those tags across texts, which is great when you’ve encountered a word once, and then find it again 3 years later in a difficult text.
  • It’s easy to work with the same interface across multiple languages. This is my preferred way of dealing with foreign texts. I use it for Gaelic, Mongolian, French, German, Italian, and am exploring its use for Greek and Latin.

 

Cons:

  • There’s no real way to do actual morphological tagging, so every inflection of ἀπόστολος is going to be a separate entry. LWT does nothing to alleviate this. FLTR does have a little drop-down when you’re entering a new word that lists similar words, so if you have already entered, say, a different form of the word, you can more or less copy what you had elsewhere. I suspect there isn’t an easy way to fix this, since you would need some way of teaching the software to do morphology for multiple user-inputted languages.
  • It’s slow to get started. Opening up 1 Peter and seeing 839 new words to tag, if you already have some experience in Koine, is not a thrilling experience, because tagging takes time. If you were starting from scratch in a language it would be more rewarding; if you’re already ‘on the way’, it’s slow to get going. But it pays off: this week I opened up a new Gaelic text I’d never tackled, and at least 90% of the words were already tagged.

Conclusion:

So that’s it. I’d be interested in your feedback, if you’ve had some experience or if you go and try it yourself now. Let me know if you have any difficulties in set-up or need a hand.

 

Tips and Advanced usage

Tip #1: You can select a string of words as a group; this is great if you want to tag a whole phrase that, for instance, might function idiomatically.

Tip #2: FLTR allows you to select “Vocabulary” as a text. This will let you filter a range of vocabulary by ‘knownness’, from a specific text or all texts, with a number of entries, sorted either alphabetically, by status, or random.

Tip #3: You can also access your FLTR vocab as two files in your main FLTR directory: one is plain text, say Greek_Export.txt, while the other is a comma-separated version, Greek_Words.csv. These files aren’t very useful to look at, but the export is the same format as the LWT TSV export, so you can actually move between the two programs.

Tip #4: You can import these exported files (from FLTR or LWT) into an Anki deck, if that’s how you like to operate.
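If you want to script that Anki hand-off rather than import the raw file, a small sketch like the following works. The column layout (term first, translation second) is an assumption on my part; check your own export file before relying on it.

```python
import csv
import io

def export_to_anki(tsv_text: str) -> str:
    """Turn a term/translation TSV export into Anki's plain-text
    import format: one 'front<TAB>back' card per line.

    Assumes the first two columns are term and translation; the real
    export may order its columns differently.
    """
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    cards = []
    for row in reader:
        # Skip rows missing either a term or a translation.
        if len(row) >= 2 and row[0] and row[1]:
            cards.append(f"{row[0]}\t{row[1]}")
    return "\n".join(cards)

sample = "ἀπόστολος\tapostle, messenger\nκαί\tand\n"
print(export_to_anki(sample))
```

Write the result to a UTF-8 .txt file and Anki’s “Import File” will pick up the tab-separated fields.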

Software for ‘reading’ foreign language texts (1)

So, particularly if you’ve come from a Greek/Latin/Classics background, most of what one is taught is intensive reading. Intensive reading involves taking relatively small segments of text and analysing each word and segment grammatically, mentally parsing and tagging things, and then understanding how the clause, sentence, and paragraph fit together.

There’s a place for that. But it’s generally not the best way to move towards a more fluent reading approach. On the other hand, the grammar-translation approach almost never employs something at the other end of the spectrum, extensive reading (http://en.wikipedia.org/wiki/Extensive_reading). There’s lots of good material and research arguing for the efficacy of extensive reading.

One of the barriers to reading classical languages in particular is that there are simply not enough suitable texts to read. There aren’t really graded readers, Lingua Latina being the sole exception. There certainly aren’t extensive series of readers pitched at stable levels, designed to move readers slowly and surely to greater proficiency and expand their vocabulary. And a great gap is that there isn’t any YA literature. YA literature, I would say, is actually amazingly important: its language is just a step below ‘adult’ literature, but it’s (usually) interesting and engaging, and adults can read it with genuine enjoyment while at a slightly lower language level.

Anyway, to read extensively in a way that works requires a fairly high level of comprehension. One needs to be recognising upwards of 90% of what’s going on in order to figure out the other 10% from context. Maybe more, maybe less. Certainly there’s a point at which there are too many unknowns and the reader gets lost.

So what if there are no texts suitable for your reading level? This is where I think some reading tools will help a great deal. Basically, we want to remove the barriers that slow reading down to the point of frustration. The main difficulty to be overcome is vocabulary – how do we raise our vocabulary to a level where more and more is comprehensible? What if we accelerated and integrated the ability to look up words, and if we made readily available on the same reading ‘page’ the meanings of every word we’ve ever encountered? That is the premise of the reading software I’ll be looking at in the next two posts. By recording and tagging *every single word*, you can see at a glance what you know, what you have previously encountered, and what you’ve never encountered.
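That “at a glance” idea reduces to a simple calculation: what share of a text’s running words is already in your tagged vocabulary? A toy sketch (the word list and vocabulary here are invented for illustration):

```python
def coverage(words: list[str], known: set[str]) -> float:
    """Fraction of a text's running words already in the known vocab."""
    if not words:
        return 0.0
    return sum(w in known for w in words) / len(words)

# Toy example: 3 of 4 running words known gives 75% coverage,
# below the ~90% suggested above for comfortable extensive reading.
text = ["puella", "rosam", "puella", "amat"]
print(coverage(text, {"puella", "rosam"}))
# → 0.75
```

Tallying tagged words over a whole text is essentially how this kind of software can tell you whether a text is within extensive-reading range.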

If someone is mainly working with texts that have already been tagged to death, e.g. biblical texts or classical texts in the Perseus collection, then maybe something like this isn’t so necessary; but once you’re outside those corpora, essentially one is tagging one’s own texts. This is a digital way to do it across texts, instead of covering a sheet of paper in notes (I’m sure plenty of us have done that) or some ad hoc software solution.

 

In my next post I’ll talk through Learning with Texts and Foreign Language Text Reader.

Hunting with Where are your Keys?

I had an interesting (new) WAYK experience over the weekend, and thought I would share it with you this week. Most of the time that I’m using WAYK (Where Are Your Keys?) it’s to try and communicate pieces of language to someone else. That is, I’m in the role of a teacher, and I’m trying to give bite-sized pieces to a learner to assimilate and acquire language.

But that is only one side of the equation with WAYK: you can equally use its techniques to obtain language when you’re a learner, because all WAYK is, really, is a set of techniques to facilitate and accelerate language acquisition. So over the weekend I sat down with a friend of mine, who is a non-native but fluent German speaker (former German teacher, lived in Germany, has a German wife). He was my ‘fluent fool’ for the session.

I explained very briefly what I was going to do, and then set up a table. To be fair, I already had some vocabulary and grammar and initial sentences, and knowing “Was ist das?” probably really helped too, but other than that my German is really at a low level.

It was very interesting once we shifted into German. He quickly understood generally what I was trying to do, and specifically what I was trying to elicit. I did have a prop problem in that I didn’t set up red-pen/black-pen clearly, and so I couldn’t draw out adjectives from him. At the same time, the difference in the types of pens I had on the table launched him into a fully German explanation of the different words for ‘pen’ in German!

One of the things I appreciated was that he mainly stayed in our target language, and would often go on in Deutsch for quite a while. I think in different contexts the learner would need to control this, but for me it was entertaining, and mostly comprehensible. When I wanted to bring it back to earth, I would just go back to constructing simple sentences to check ‘is this right?’

A couple of times he veered into English language grammar explanation. Again, for me this was fine, partly because I don’t mind learning grammar, partly because of the informality of our session. I think in a more focused series I would want to encourage him back into the German more frequently.

One of my reflections is that when you stay in the TL, and you’re trying to elicit certain structures, it becomes apparent more quickly when things aren’t going to plan: i.e. you had a bad set-up, you can’t get the piece of language you want, and you can’t express what it is you’re trying to elicit. So much is about set-up; get it right and you’ll get the language you’re hunting.

Theory Friday: Flashcards

One of my sidelines this year is to tutor an hour a week for students who are tutoring other students in introductory Greek. It always seems complicated to explain that. We are only in our second week, and in fact the student-tutors have not yet commenced their tutoring of their students, so we have taken the opportunity to do a little bit of meta-thinking about language acquisition/teaching and methods. Of course, you know this is just the kind of thing I like to do.

What follows is a post-factum write-up of some of the things I covered.

 

This week we spent some time thinking and talking through Flashcards. Good old flashcards!

What is the quintessence of the flashcard, physical or digital? I take it that it is the direct correspondence of one discrete unit of information with another. This is both the genius and the weakness of the flashcard approach: the flashcard can never get away from this 1:1 correspondence model; even when it becomes something like X:y,z,a,b,c, it is still operating on a correspondence model. At the same time, this segmentation and compartmentalisation is what allows it to work so well for massive rote processing. Once we accept this limitation, we can think through two related questions:

  1. How do we mitigate the traditional weaknesses of flashcards?
  2. How do we complement the use of flashcards for better learning outcomes?

I’ve gone back and forth on flashcard use. I think that overall they are an inferior method of learning vocabulary in general. But they do have their uses, which is why I swing back to using them occasionally. Their advantage is that they allow massive rote learning of vocabulary by a relatively automatic process. This is very useful for initial stages, at which constructing materials or finding texts that allow high comprehension is difficult. This is one reason flashcards should, in general, be built on corpus frequency – high frequency vocab initially acquired by flashcards can rapidly be both solidified and nuanced by extensive comprehensible input.
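The corpus-frequency principle is easy to mechanise: count word occurrences and front-load the deck with the most common items. A minimal sketch, with an invented toy corpus standing in for a real tokenised text:

```python
from collections import Counter

def frequency_deck(corpus_words: list[str], size: int) -> list[str]:
    """Return the `size` most frequent words, highest frequency first;
    these are the highest-payoff flashcard candidates."""
    return [word for word, _ in Counter(corpus_words).most_common(size)]

# Toy corpus; in real use you'd tokenise an actual text corpus.
corpus = ["καί", "ὁ", "καί", "δέ", "καί", "ὁ"]
print(frequency_deck(corpus, 2))
# → ['καί', 'ὁ']
```

Building the deck this way means the first cards learned are exactly the ones extensive reading will then reinforce most often.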

The weaknesses of flashcards include some of the following:

  1. Encouraging a correspondence/translation approach to language
  2. Reinforcing native-language thinking patterns
  3. Presenting words in a decontextualised manner
  4. Prioritising visual-textual learning processes
  5. Being ineffective for structures

Point 1 is the most difficult to address, because it is the quintessence of flashcards; I think it best to mitigate it through complementary approaches. Point 3 can be mitigated by including contextual information on one side of the ‘card’: a sentence, clause, or even a phrase can contextualise new information in a way that provides more language-oriented material than a mere ‘mental factoid’ gloss. This sentence or phrase should be relatively simple, something the learner can process without any great mental difficulty; i.e. it shouldn’t operate at a sentence level beyond what the learner is already comfortable with, and the rest of its words should be familiar and immediately understood.

Points 2 and 4 can be mitigated in relatively complementary ways: by replacing native-language glosses with pictures, and/or using audio. Pictures have two downsides: they require a very large up-front cost in generating a deck with relevant pictures (I’m thinking digitally here), and the specificity of pictures can be problematic; one has to think carefully about how to use a picture to represent a concept non-ambiguously. I’ve never seen an audio-to-text deck, but I think it would be brilliant if one whole ‘side’ of a deck were just audio material. I suppose the extreme would be audio-to-pictures.

Point 5 has to do with things like untranslatable particles, modal markers, etc.: things that need to be understood as part of a larger unit. In Greek, ἄν is my default example; it is practically useless on a flashcard. This can be mitigated by embedding these kinds of words/structures into sentence-level units and highlighting/underlining/otherwise marking the targeted information. The learner is then responding to some actual language use, while being reminded of the target information.

On to my second question: how to complement flashcards. As I said earlier, flashcards are really like an intense boot camp for acquiring a basic vocabulary. Personally, I think they make best sense when used solo, when other materials are not available. I would complement them with carefully constructed graded reading material, which establishes contextually comprehensible input and increases reading proficiency, and with verbal practice of some sort. Flashcards would then be used to pass your time on the bus or the like, reinforcing this basic information in another form. Next time I’ll talk about intensive and extensive reading practice and gradation.

Would love to hear your thoughts/experiences/relevant research on the topic of flashcards.

Why People continue to teach via Grammar-Translation

Foreword: this is our last post for 2014. Enjoy your holidays and you’ll hear again from the Patrologist in 2015.

 

In this second of twin posts I’m exploring common ‘defeater’ reasons people give for sticking to GT as a method and rejecting approaches like Communicative Instruction. In each case I give a brief explanation of the belief, and some counter-points.

  1. It doesn’t work

I.e. applying CI or other modern language approaches to classical languages ‘doesn’t work’. I’m not sure this is a sincere objection; I suspect it’s rather of the order of ‘I don’t want to engage this idea and I’m blanketing you out’.

That it has not always worked, or is not pervasive, or that sometimes it doesn’t go perfectly: these are not arguments against it. In fact it has worked and it does work. A few timely videos of those individuals with a decent speaking facility in Latin or Greek show that it is by all means possible. And it is not just possible for the elite; it is possible for all students.

  2. Dead languages are different

 

Not heard as often these days, but for quite some time people would say things like, “Latin is no longer spoken, therefore our method of learning must be different.” They were generally not making a comment on the difficulties of learning a language no longer spoken (i.e. lack of speakers to talk with) but asserting a fact about the nature of a no longer spoken language.

 

Which is absolute nonsense. Latin is not a different kind of thing just because it lacks a speaking community (I don’t wish to debate whether it does have such a community at this time). Latin is a language, which means it can be learnt as a language. The status of any language with regard to the number of speakers currently using it has zero bearing on whether it can be learnt as a language or must be learnt as a ‘something other’.

 

If, heaven forbid, all French speakers were wiped from the face of the earth tomorrow by some new, virulent, French-speaker-targeting super-virus, this would not alter the kind of language that French is. It would certainly create obstacles for anyone wishing to learn French. And, given the sudden fatality involved, I can’t imagine anyone rushing to do so, but French itself would not have changed.

 

So too with the classical languages: if they are languages, they may be learnt as such.

 

  3. It’s too hard

And remembering arcane rules of grammar that appear once every 10,000 words isn’t hard?

 

Yes, I would say, learning a language is hard. It’s hard to learn it as a spoken language, and it’s hard to do GT. All GT students know that! And all students who acquired an L2 as adults know that it was hard too. It took hours, it was tiring, it involved a lot of interaction with speakers, probably embarrassment, and there were many highs and lows and plateaus as well.

 

But it’s not harder. GT is not only hard, it’s often incredibly boring. CT is hard because it requires more investment, but it yields greater satisfaction, it’s more interesting, it’s more motivating. It’s far better to go home from a lesson of CT having interacted in the Target Language for 1-3 hours, and have one’s head swimming in the TL, than to go home after 1-3 hours of GT with one’s head full of “The wicked sailors gave roses to the good girls.”

 

 

  4. It takes too long

Basically this reason says that while CT might ‘work’, it is slower, takes longer, and in the end takes too long for the results it promises. Better, in this view, to stick with GT, which requires fewer hours and gets us ‘somewhere’ faster.

I am almost convinced this is a valid point. I’ve written several times about how many hours working with Comprehensible Input in the Target Language might be required to achieve decent levels of competency, and they are considerable. Anyone learning a modern L2 knows this. I think those invested in teaching classical languages need to be very up-front and honest that considerable time investment is necessary.

However, what I would say is this: I don’t think GT takes fewer hours to get to the same place; I think GT takes fewer hours because it teaches and achieves far less. For GT practitioners to achieve real reading fluency takes many, many hours, which is my contention under point 6. In this instance we should not compare apples and pears. Furthermore, from what I generally hear from school teachers using CI-based instruction, their results outstrip traditional methods, especially when they spend a little bit of time prepping their students for the kind of tests that traditional methods favour. If that little bit of prep time isn’t there, CI students often simply don’t understand the jargon of grammar questions. No wonder, since they didn’t need it.

So let’s hold off on conceding that GT is ‘faster’, because it may not be faster and it may not even be to the same destination.

  5. It doesn’t match our goals

 

What are ‘our’ goals? I think this is a really important question, and debate, to have. Often it seems like the goal of classical language instruction is to do grammatical analysis, but I’m sure most people don’t actually think this is the goal. Isn’t the real goal to be able to understand, appreciate, and interpret texts in classical languages, and so to discuss and engage their ideas and content? Isn’t it ultimately the content, not the form, that interests us? And while content and form are never divorced, just as culture and language are inseparable, they are distinct things.

 

If our goal was to train grammarians, then grammar is what we ought to teach. There’s nothing wrong with being a grammarian, of English or of classical languages. And in fact, probably some people do want to study the grammar of ancient languages. We need those people! But that’s not the goal of most students, or of most programs.

 

CT approaches do match our goals, they drastically and desperately match our goals. The claim that no one needs to know how to order a latte in Koine is irrelevant. That’s not our goal either. The goal of CT is to produce competent users of the language with an active facility that enables reading and comprehension of texts in the target language without recourse to translation or grammatical analysis for the purpose of understanding. (Though translation and grammatical analysis may be done for other purposes).

 

  6. GT is how I learnt, so it works

 

People who end up as teachers of classical languages via GT are the 4%. That is, they are often the small minority for whom GT ‘clicks’, who ‘get it’, who enjoy it, while the rest of the cohort is destroyed by a war of attrition fought with boredom and irrelevance.

 

And some of these teachers get very, very good at Greek, Latin, what have you. Especially those that do doctoral programs that require epic amounts of reading of primary language material. But this is my hunch – it wasn’t GT that got them to that point, it was using GT to render those texts comprehensible, and having a huge exposure to comprehensible texts over time. It was Comprehensible Input that gave them competency in reading directly, and this was only indirectly the result of GT.

 

I could be wrong, but I could be right too. People whose primary discipline is classics or the like, who studied primarily via GT, and who achieve marked ‘fluency’ in reading ability, often have a pop- or folk- view of language acquisition that is poorly informed by research or SLA theory, and dominated by the insular views of their own discipline and experience.

 

Even if it did work for you, why should we stick to a method that works for the 4%? What about the 96%? What if we used methods that meant classical languages were learnable by all, not the self-selective and self-satisfied ‘elite’? Wouldn’t that open up the field for the simple ploughman in the field in a whole new way?

Why People teach via Grammar Translation

In twin posts I’m going to explore some of the reasons people teach classical languages (by which I mean Ancient Greek, Latin, and similar languages that are mostly no-longer spoken and primarily of academic or historical interest) via the Grammar-Translation method (i.e. teaching grammar explicitly and training students to translate into their native tongue for the purpose of understanding). The second post will follow up on this one and tackle some issues more directly.

 

  1. That’s how the Ancients did it

 

People often think that ancient students of foreign languages learnt primarily via Grammar-Translation. I think this is incorrect. Firstly, it’s often prejudiced by the fact that the rhetoric-based education system of Greece, then Rome, included explicit grammar instruction as the fundamental stage of language and literature study. However, this does not mean those students learnt either their L1, or really their L2s, via that grammar instruction. In the case of upper-class diglossia among Romans, who often spoke Greek quite well, this should be tempered by the very fact of that diglossia: they had a living Greek-speaking community into which they were being initiated.

 

  2. That’s how we’ve (‘Classics’) always done it

 

Again, largely untrue, this time for two reasons. I recommend anyone interested in this to read two books. The first is Waquet’s Latin, or the Empire of a Sign, which deals in part with Latin’s socio-cultural place in the 17th and 18th centuries and explicitly discusses shifts in pedagogical practices. The second is James Turner’s Philology: The Forgotten Origins of the Modern Humanities, which discusses in depth how, in the Anglophone world, the practice of philology ‘disciplinised’ into the modern humanities, including classics, especially from 1850 onwards. ‘Classics’ as a distinct discipline along the lines we know today did not exist before then, and the revered pedagogical practices that dominate it often go back no more than 200 years, or less.

 

  3. That’s how my teacher did it

 

The path of least resistance for teachers is generally to teach what they know, how they learnt it. For many, myself included, Second Language Teaching (SLT) was explicit grammar instruction paired with translation exercises. Regardless of beliefs about SLA, the pressures of teaching often ‘push’ us to simply teach with what is ‘easiest’, and what is easiest in many classes is to pull out a textbook and replicate our own teachers.

 

  4. That’s what worked for me

 

Those that teach classical languages, make no mistake, are often those who did really well at them. And when points 1–3 culminate, the self-fulfilling elitism can be deafening.

 

In a recent discussion relating to why certain advocates of ditching G-T were so down on G-T, someone helpfully pointed out to a newcomer that all the people in the discussion who were down on G-T were those who had been very successful at G-T. This argument isn’t, generally, coming from those who failed because of G-T, but from those who succeeded at the 4% method and have come to consider it deeply flawed. Just because it worked for you doesn’t mean it is a viable methodology in general. Indeed, the self-selection involved in ‘it worked for me’ really means, “it will work for people like me, and that’s the only type of student I care about”.


  4. That’s what our goal is.


I’m going to tackle this much more thoroughly in the next post on this question, but some people think G-T achieves the kinds of goals we want in these disciplines. What does G-T achieve? It produces Grammarians and it produces Translators. Those are two good things, but is that the goal of classics and related disciplines?


One of the problems is that grammarians often try to do linguistics, and when they do it’s usually second-rate linguistics, because they are grammarians, not linguists. The problem with translators is that they learnt to translate from a language they’re not competent in, instead of achieving competency first and then learning the art of translation. Meanwhile, don’t we actually want to train people as historians, litterateurs, theologians?

Exegesis as Reading

A little while ago there was an exchange about exegesis that started on B-Greek, a list I generally read but don’t interact with. Then there were a few blog posts, one by D. Streett, one by B. Hofstetter.

I often deliberately don’t engage in discussions that I don’t have time for, but obviously I have opinions. Particularly outrageous, bombastic ones. So part of my heart warms when R. Buth writes, “This is why I define ‘exegesis’ as learning to extract meaning from a language that one does not control.”

Somewhat like Barry, I had already done a degree that was virtually literature studies, as well as started down the classics track and taught myself the fundamentals of Koine Greek, before I got to seminary. One of the reasons “exegesis” is so problematic is that biblical studies got hived off, with so many other humanities disciplines, into a discrete ‘discipline’ about 150-200 years ago. In this regard, see the recent volume by James Turner, Philology: The Forgotten Origins of the Modern Humanities, which among other things does a good job of explaining how the humanities ‘disciplinised’.

Exegesis as practised by most biblical studies students is a process of analysis and interpretation of a text at a fine-detail level: grammatical, lexical, syntactical analysis of words, phrases, verses, put together at the level of a paragraph. It rarely moves beyond paragraph level. It is, as Buth points out, often done by students/scholars who actually have no “control”, that is no genuine active competency, in the target language.

Let’s just stop and say, “That’s odd.” We would find that incredibly odd for a modern foreign language student. “Oh, you can’t speak a word of French, but you can analyse to death the syntactic choices of individual sentences in Camus’ The Plague?”

I want to be really clear here: there is a place for fine-detailed studies of grammatical, lexical, syntactical elements of small units. Everyone else calls this linguistics. And many, many of these ‘questions’ disappear, or better yet are disambiguated by a genuine competency in the language.

Take a step back – what is the purpose or goal of exegesis? To acquire a better understanding of the meaning of the text. We can call this ‘exegesis’, but we may as well call it ‘reading’, though we must keep in mind that ‘reading’ here actually means something like ‘interpreting’, i.e. we are engaged in attentive, analytical reading at micro and macro levels. Or, “literary criticism”.

I don’t think calling it ‘reading’ is always helpful. In my school there were some teachers who used to say nonsense like “We don’t interpret the Bible, we just read it.” That was always doctrinal drivel, a claim made to sidestep the theological difficulties that the very idea of ‘interpretation’ generates. No, that won’t do. Reading itself is an interpretive act.

On the other hand, reading here is a higher-order activity than mere reading. And it really must go beyond the sentence level. Unless you can get to the level of discussing a whole text – a whole book – you are missing the integrity of the text and cannot complete your reading. That’s why discourse analysis, or just plain literary criticism, needs to work at the macro level.

To wrap up, I am constantly amazed to interact with so-called ‘critical scholars’ who look at, say, a book like John’s Gospel and see nothing but a pastiche of cut-up pieces representing a proto-Gnostic text re-edited by a proto-Orthodox editor, then re-edited again. Why do they see only that? Because they analyse a painting by looking at each blob of paint from a stroke of the brush and considering it a different source. They never step back and see the artistry. Whether they are right or wrong is irrelevant to the fact that they can’t step back and look at the whole, can’t discuss the meaning of the book, can’t discuss themes, genre, art, motifs. Because they can’t decide which of 400 types of genitive the proto-Gnostic redactor meant, and their competency in the language is that of a tourist who got off the plane with an antique reference grammar and nothing else.

Why I do Sub-Optimal Language Exercises

Why bother doing anything but the best types of language acquisition activities?

I’m a firm believer in Comprehensible Input (CI), and fairly sold on Krashen et al.’s claim that CI is the key to language acquisition. I don’t quite buy Krashen’s ‘strong’ version – that nothing but CI is necessary – because I think he’s framing the question a little incorrectly. Krashen these days makes the strong claim that CI, and only CI, is sufficient by itself for language acquisition. That might be true, but there are other aspects of language competency that are perhaps not quite ‘acquisition’. The ability to speak, write, and produce output is probably a secondary outcome of acquisition, but in my view and experience one still needs some practice in these output skills in order to actually produce output.

Anyway, I do all sorts of activities that are not optimal CI activities. I read texts too difficult for me. I do ‘composition’ exercises that are really translation exercises of banal sentences from English to Greek/Latin. Lately I have been working on an idiosyncratic but modern translation of the New Testament (I’ll write more about that individually later on). Why? Why waste time?

  1. Don’t wait for the best.

There is no way to get optimal CI in Greek or Latin. There’s no language community, no children’s cartoons, no five levels of graded readers about contemporary society, no young-adult extensive reading materials. One will never derive enough genuine CI from currently available resources.

  2. Output exercises are nonetheless moderately useful.

Because (a) they develop output automaticity, even if no new language is being acquired, and because (b) the process of doing the exercises does involve some CI, even if suboptimal.

  3. Translation is itself an art to be acquired.

While it’s generally, and genuinely, preferable in my view to work mentally in the target language, there are times when one will want to translate – in either direction. There are structures of phrasing and thought that come to us naturally in our first language, and in the absence of a known target-language structure, we tend to code-switch or break off the thought. Working systematically to acquire some of these structures will improve translation ability.

  4. For others

I think a previous generation thought you acquired language competency largely by suffering and toil. They were wrong about that, but using sub-optimal methods does require suffering and toil, because the time required to get the same amount of genuine CI is so much greater. The only way we will produce teachers competent enough to utilise more optimal methods is to have teachers who are prepared to suffer a little to acquire the hard way, and generous enough to pass it on by an easier way.

Like a broken record

Q: Patrologist, why do you talk so endlessly about language acquisition?

A: Because our field is so broken. In no other field do so many people who know their target language so poorly talk with such authority. I honestly wish it weren’t necessary, that we instead lived in a time, an age, a place where we took for granted that people who studied ancient Greek literature knew ancient Greek, where people learned in Hebrew had learned Hebrew, where scholars of Latin had been schooled in Latin. But we do not live in such a mythical land; we live in its counterfeit, where people peddle outdated methodologies to reach inadequate heights.

I believe this is changing, but slowly, and only because some are agitating – pointing out that the Emperor does indeed have no clothes. You can try it at home – approach a Greek professor or a NT one or whatever, and initiate a Greek language conversation. If you don’t get a quick χαῖρε, ὦ μαθητά, πῶς ἔχεις σήμερον; then there really is something wrong.

On the flipside, all I am saying is that we apply Best Practices from contemporary Second Language Acquisition to classical and biblical studies. This should be the least controversial thing in the world. And all I am discussing is how we can do that. There is a long road ahead of us. That’s why I keep talking about the same things over and over. Until the revolution comes.