Semi-regular rant on Greek language pedagogy

(I’m mostly in the midst of doing a lot of thesis writing, but thought I could take some time out to ride a hobby horse).

  • Knowing a language isn’t a qualification for teaching a language.

We usually think that knowing something is a pre-requisite for teaching it, and generally that’s true. But it’s not a sufficient condition. Plenty of people have skills or competencies which they do not have the ability to teach very well. This is why teachers get trained: so they know (a) how to teach, as well as (b) the material they will teach.

Why would you think a language was any different? Monoglot Anglophones are particularly susceptible to this delusion: “Oh, you know Spanish, teach so-and-so.” If you’re a monoglot L1 English speaker, have you ever tried to teach English? It’s not that easy.

Why then do we think that merely being a successful student of Greek or Latin or X-language turns one into a qualified teacher of the same?

  • Having a PhD in Greek linguistics or in New Testament studies indicates almost nothing about how well you can teach Greek.

Most seminaries use their New Testament faculty to teach Greek, on the theory that they’ve studied a lot of Greek and did PhDs with Greek. But following on from point 1, this is only incidentally related to knowing how to teach Greek. This guarantees that the methodologies used in seminary-based education for Greek will continue to passively reproduce ‘the way I was taught’ from generation to generation. Which is not best-practice in the field at all.

  • Knowing a language and knowing about a language are two fundamentally separate things.

Anyone who gets to the end of a grammar-translation based program ought to realise this. Knowing about a language – whether in the terminology of (traditional) grammars or in the jargon of the discipline of linguistics – is not the same as possessing a communicative ability to read/write/listen/speak directly in the language. They are two separate things, and they are acquired separately. Most speakers of an L1 do not develop any significant ability to talk about the grammar of their own language unless taught it explicitly and formally. Students whose primary educational content is a grammatical description of their target language should end up with an ability to analyse and interpret it, but any genuine acquisition of the language is incidental, and sometimes accidental.

  • Pursuing acquisition doesn’t mean surrendering analysis.

One of the arguments I most commonly hear against communicative-based approaches to language acquisition for languages such as Greek is that it means students will not learn to do the kind of linguistic analysis that is currently taught. That would only be true if a program were designed exclusively to provide language acquisition and deliberately avoided any meta-language discussion. There is no intrinsic reason why students could not be taught meta-language skills in addition to actual language acquisition. Nor, if we are honest, would it be that problematic or time-consuming to teach them to do so.

  • Pursuing acquisition doesn’t mean “too long, too slow, too little.”

Another objection I commonly hear is that while communicative-based approaches may be possible, they would take too long to reach their destination – time which programs and students don’t have. To which I have several replies. Firstly, this is largely untested for classical languages – there are so few programs running full-blown communicative-based pedagogies that seriously evaluating whether it actually takes too long is not possible. Assuming that it would is bad research methodology. Secondly, I suspect this is not a concern at the level of language pedagogy, but at the level of seminary curriculum design. If students and programs don’t have time to actually teach Greek as a language, that’s a decision about what’s important for seminary graduates, and a wrong one in my view.

  • There is a point to pursuing acquisition.

The third common objection that I hear and feel like rambling about today is that there is simply no point or value in developing a communicative ability in Greek. Honestly, I find this baffling. I would never feel that someone whose English corpus was limited to 20,000 Leagues Under the Sea, and whose ability to understand it was limited to sentence diagramming and word-by-word glossing, was someone who ‘knew English’ and could reliably understand English-language texts. For every modern language we expect Acquisition, not Grammar-Knowledge. Ancient languages are not categorically different.

  • We do ourselves and our students a disservice by perpetuating Grammar-Translation

The overwhelming consensus in Second Language Acquisition theory and applied linguistics is that G-T is a poor method, and it produces sub-standard results. It’s not best-practice, and we’re kidding ourselves if we think that it is. Continuing to teach generations of students Greek, Latin, insert-other-ancient-language-here via Grammar-Translation, when collectively we know better, is a dishonesty, and the cognitive dissonance should cause us mental discomfort. Demand something better from yourself and for your students.

“It depends…” – Some thoughts on translation

As I mature as both a reader of ancient texts and a teacher, I find myself saying a lot more, “It depends…”, as well as “You could translate it that way”, “That’s one way to render it”, and a lot of “it depends on the target audience of your translation.”


Most students who study Greek or Latin in a traditional program are taught to translate, and to translate in order to show their knowledge of the underlying grammar. That is, the goal of their translation is not ‘word for word accuracy’ or ‘literal rendering’, but ‘demonstration of grammatical knowledge in the target language’. Later on, they are told, you can use freer translation, but for the beginning stages we want to see that you know grammar like we know grammar.

Which, from that school’s philosophy, makes perfect sense. But we all know (we do all know) that translation is an intricate art and is always betrayal. Translation isn’t even simply a spectrum from ‘literal’ to ‘dynamic’ with some super-holy synthesis in the middle where the HCSB lives.

When we translate we are trying to carry something of the base text over into the translated text. Usually that is ‘meaning’. But even meaning is a bit nebulous – do we want to preserve the meaning of words, or of phrases, or of clauses, or the gist of the whole passage, or sometimes the socio-communicative function of the text? It’s never simple.

Likewise, our translation can be familiarised: we can try and render elements of the socio-cultural context of the base text into immediately understandable analogues in the target language’s culture (for example, what do you do with the Good Shepherd in a culture that doesn’t have, and never has had, sheep?). Or our translation can be alienised: preserving idioms and cultural references that won’t be immediately understandable to the reader, and that will require them to acquire new information about the base text’s culture. Or it can even be defamiliarised: taking elements of a text that are comfortable and familiar, say in an already existing translation version, and rendering them in a way that is jarring and dislocating, so as to force the reader into a new act of reading.

Personally, the way I try and train readers of ancient texts is to focus as much as possible on getting them to read what is right in front of them. Read the text “as it is”. When I bring this over to translation, my philosophy is “best represent the text as you can” – if it has ambiguities, try to render them ambiguously; if it has clarity, express that clarity; if it has foreignness, preserve the foreignness. I think of this as fidelity in translation, but I recognise that there are other ways to do it, and that even a single translator (myself!) translates differently for different contexts and purposes.

Poetry is a great place to test translation philosophy. If you accept Jakobson’s functions of language, and even some modicum of structuralism, then poetry is a form of language in which the focus is on the code itself: the language used to mean is itself the point. Poetry is language highlighting language (but not language talking about language – that’s the metalinguistic function!). Anyway, what do you translate in poetry? If you focus on meaning, you lose the poetics; but if you focus on poetics, you must betray the meaning! And even if you focus on poetics, you still face difficult choices.


Say we’re translating classical Greek poetry into English. Do we choose an English verse form? Free verse? Alliteration? Metre? Even if you choose metre, you’re doing a ‘disservice’, since Greek metre is quantitative but English metre is stress-based. Yet doing exactly that has created an English metrical tradition, whereas the further back you go, the more alliterative the English tradition becomes. And if you translate into a contemporary poetic medium, you might end up with free verse. Whatever you do, you are in fact creating as well as translating, and inevitably betraying. You could focus on meaning instead, but then you would betray the poetic function of the text. You have no choice but to fail! And yet translations succeed. That is the amazing thing about translation: it’s actually possible.


What are your thoughts? How do you feel about translation and translation-philosophies?

Language as problem-solving

One way to look at language as a phenomenon is to realise that it solves problems.

That ‘problem’ is communication, and so language solves a set of problems that range from the most simplistic (I want you to give me a rock), to the incredibly complex (I want you to have the same knowledge of Hindu ontologies that I do). But whatever it is, it’s a problem for which language provides a tool to solve.

If you think of language this way, it also helps make sense of why languages are so similar – they have to solve the same set of problem-types. There might be infinite ‘problems’, i.e. infinite purposes to which you can apply language, but they can ultimately be typed as a finite set of purposes or problems. And so we can classify language patterns across languages by what they solve.

For example, things that generally fall under the ‘Imperative’ label solve a fairly basic problem: how do I tell you I want you to do something? I use something that functions as an imperative.

This is a good example to talk about. Let’s say I’m sitting at dinner with you and I want you to pass the salt.

I might say, “Pass the salt”. We would call this, grammatically, an imperative. But English linguistic cultures generally don’t like plain imperatives.

We might instead say, “Please, pass the salt.” I don’t know what grammarians call ‘please’; we could analyse it historically and etymologically, and talk about, ‘if it please you’; or we could talk about it functionally and say that it softens the directness of the imperative.

Or we might say, “Would you please mind passing the salt?”, in which case we have added a verb (“mind”) and a modal (“would”) to make the imperative more indirect. Grammatically, most people would not parse this as an imperative. But functionally it is one.

Now, when you come to a different language, you can ask yourself this question, “How do I get someone to pass me the salt?”

Do you see how the problem gives you an acquisition tool? Or to rephrase, thinking of language like this allows you to set up situations to acquire language. Want to learn the imperative? Create situations where you can elicit it and where you can use it and be corrected.

Mongolian is an example close to my experience, and a good counter-example. It’s useless to translate something like “Could you pass me the salt?” word for word. You get something like Чи давс надад өнгөрөж болох уу?, which is not even a proper sentence, because you can’t use ‘pass’ in that way. Nor can you use something indirect like “I would like the salt”. Even “I want to get the salt” (Би давс авмаар байна) is not appropriate: in Mongolian this is an indicative statement expressing the fact that you want the salt. There is no socio-linguistic convention that makes it an implied imperative. You are simply telling the other party about your volitional state.

To get the salt, you need to say something like давс аваад өгөөч, which translates as something like “take the salt and give [it]”. The combination “take, then give” is the proper expression for the English “pass”, and the -өөч ending there is one form of the imperative.

This is on the simple side of examples, but I’m trying to illustrate that thinking of language as a set of ways to solve problems allows you to:

  1. Ask the question, “How does my language solve problem X?” instead of “How do I translate X?”

2. Set up and look for situations that elicit certain structures, by thinking through what problems there are.

Triple language overload

The Patrologist has been quiet lately because he’s in Mongolia teaching Ephesians in Greek. It’s a real test of one’s linguistic competencies to explain the intricacies of Pauline grammar and theology in a third language while your notes are in your first. And it doesn’t leave much time for blogging. Semi-regular thoughts on all things Patrology will resume next week. In the meantime, let me just say: what is up with the crazy thought process behind Ephesians 4:16?! That sentence is all over the place.

How do you get an ancient language ‘right’?

You know, at the end of last week I put up that YouTube video with an invitation to come and learn some Latin with me through CKI. I knew before I posted it that there would be errors, because anytime I speak in a non-native language I generate errors, and it is impossible for me to catch them all. Indeed, I make errors even in my native tongue, so it’s not really a surprise.


I could launch into a defence, self-deprecating or self-justifying, of the kinds of errors I made in that video, or why, but I don’t really think that’s helpful. If you want to learn Latin through CKI, go and enrol. If not, don’t. If you have specific questions about my Latin or my credentials, you can write to me.

No, I want to talk about something else today, and that’s the issue of how we know whether we’re getting an ancient language ‘right’. If Greek, Latin, or any other now-‘classical’ language is going to be taught along communicative lines, with an aim of “accurate speech that reflects the linguistic norms of a certain time period”, how do we go about that?

I was asked this a while back in person, and then I gave a double-barrelled answer, but I think one of those barrels can be elaborated on.

So, we don’t have a living, breathing, talking language community to norm our own speech behaviours. Therefore we must use other means:

  1. Explicit Grammar Check and Correct
  2. The Textual Corpus

Let’s talk about 1 first. We do have a pretty good grammatical knowledge of Latin (let’s stick to Latin in this post), so we can pre-check and post-check things we say or write. Pre-checking is what I try to do before a teaching session: I look at what I want to teach and double-check forms, patterns, usages, even vowel lengths and accents. I’m trying to make sure in advance that what I present is right. Post-checking happens afterwards. Many things arise in the course of using the language – some of which I think I know, some of which I half know, some of which I’m less sure of – all of which I want to review later. It’s almost certain I make mistakes. Explicit checking allows us to go away, mark what wasn’t correct, and correct it the next time around.

The second element is that we have a large textual corpus. It must stand in for a speaking community, but we can use it in several ways. One way is simply to read as much authentic material as we can manage. This exposes us to natural patterns of usage that we wouldn’t think of ourselves, and ingrains in us the phrases, idioms, and structures of the language at both a grammatical and a discourse level.

We can also use the corpus more explicitly: we can run analyses on words, forms, phrases, etc., to actively work out how some things might be expressed. This I don’t really do enough of – it’s not really my area – but I still think it’s important.
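To make that concrete, here’s a minimal sketch of the kind of explicit corpus query I mean – asking how often a form actually occurs. The mini-corpus and the counts are my own invented illustration, not a real analysis:

```python
from collections import Counter
import re

# A toy stand-in for a real Latin corpus.
corpus = [
    "Gallia est omnis divisa in partes tres.",
    "Omnis Gallia in tres partes divisa est.",
    "Caesar in Galliam contendit.",
]

# Crude tokenisation: runs of Latin letters, lower-cased,
# counted across all the sentences in the corpus.
counts = Counter(
    word.lower()
    for sentence in corpus
    for word in re.findall(r"[a-zA-Z]+", sentence)
)

print(counts["in"])           # → 3
print(counts.most_common(3))  # the most frequent forms in this tiny sample
```

A real workflow would point the same sort of query at a full digitised corpus, and at phrases and collocations rather than single words, but the principle is the same.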

We cannot become native Latinists or native Koine speakers, and in fact even if we achieve the vaunted ‘fluency’ there is always more to be learnt, more mastery to be acquired. Learning to norm our own speech against an objective standard is what helps us stay on track.

Figurative Language

I’ve been trying to figure out how to write about this for a while, and I don’t think I’ve got it worked out yet.


Firstly, what’s the gap between literal and figurative language? Literal language is generally thought of as ‘according to the proper meaning of a word’. Figurative language, then, is when words are applied not according to their ‘proper’ meaning.

Unfortunately words cross their ‘proper bounds’ all the time, and the figurative can become literal. Is referring to the ‘leg’ of a table a proper or a figurative usage? Was it perhaps once figurative but its usage has now become literal, so that within the bounds of this lexeme’s semantic domain one must include ‘one of several generally-vertical portions of a table used to elevate the horizontal portion off the ground’?

When it comes to the Bible, nobody reads literally. And calling oneself a ‘literalist’ is itself a figurative use of the word, because it usually means ‘someone who holds to a higher concept of inspiration/authority/etc. than that group of Others’. The attempt to distinguish between ‘literal’ and ‘literalistic’, while sometimes helpful, is itself this kind of exercise in Othering. Those who want to frame themselves as ‘literal’ refer to the genre of a text, the intention of an author, the ‘normal’ ways of meaning applied in language; i.e., they recognise that the Scriptures contain figurative usages.

Our real points of contention come when we try to work out where Figurative language ‘starts’, where it ‘ends’, and what counts as ‘too far’. Does Figurative interpretation have ‘proper limits’? This is the concern with ‘allegory’.

What is allegory? Typically, allegory is understood to be a figure of speech involving a sustained or extended metaphor (itself not a very helpful word, “a word or phrase applied to an object or action to which it does not normally refer, to create an implied likeness or comparison”) in which multiple points of comparison are made, transforming the original text into a highly symbolic or coded text.

Rather than giving you an example of allegorical interpretation of Scripture, I want to illustrate with an allegory used within Scripture:

Judges 9:8-15

“The trees were determined to go out and choose a king for themselves. They said to the olive tree, ‘Be our king!’  But the olive tree said to them, ‘I am not going to stop producing my oil, which is used to honor gods and men, just to sway above the other trees!’ “So the trees said to the fig tree, ‘You come and be our king!’ But the fig tree said to them, ‘I am not going to stop producing my sweet figs, my excellent fruit, just to sway above the other trees!’ “So the trees said to the grapevine, ‘You come and be our king!’ But the grapevine said to them, ‘I am not going to stop producing my wine, which makes gods and men so happy, just to sway above the other trees!’ “So all the trees said to the thornbush, ‘You come and be our king!’ The thornbush said to the trees, ‘If you really want to choose me as your king, then come along, find safety under my branches! Otherwise may fire blaze from the thornbush and consume the cedars of Lebanon!’ (NET Bible)

Actually, you probably need to read all of vv. 1-20 to get the proper context, but this is an allegory that is easily understood. It’s an extended metaphor with multiple points of comparison, giving a vivid symbolic narrative. And the allegory is controlled by the author – he gives us the points of comparison and the limits of the allegory itself, in vv. 16-20.

The problem, generally, with allegorical interpretation of Scripture is that it is uncontrolled because the text and its author have no scope for limiting the points of comparison or analogy.

Those who rail in favour of ‘literalism’ often do so against allegorical interpretation and especially its excesses and caricatures. But attacking the extremes of figural language is no more convincing than attacking literalism by constructing a straw man of ‘literalists’ who refuse to accept even the most common figures of speech.

I think we need to do ‘better’ in this area. We need a more thorough-going thoughtfulness about how figurative language works, and how it is bounded. I hope to keep exploring this in some future posts, which I will get to whenever I get to them. But feel free to add some thoughts and questions in the comments.

How many languages do you speak/know/etc.? Acquaintance vs. Competency vs. Proficiency vs. Fluency

It’s not infrequently that I get asked a version of the first question. It happened a week or so ago: a girl in my research office casually started a conversation with, “So, how many languages do you speak?”

My reaction varies a little based on the question, or more precisely the verb chosen in the question. Generally speaking, I don’t like to launch into a discourse on the difference between knowing/speaking/acquisition/etc. That’s what this place is for!

I think one of the difficulties is that we have a Native Language concept that interferes with, or influences, how we think about L2s. That is, we generally think we ‘know’ our L1(s), and treat them as a singular entity that is “known” – even though, of course, native speakers often don’t ‘know’ some things about their language, or make errors, or any number of things. Let’s not even get into the idea of idiolects and each language as an idealisation. We tend to think of Languages as Idealised Units, and knowledge as binary: you take a course of instruction and then you “know” the TL. Even when we know this is wrong, we still have this tendency. We, ironically, need better vocabularies for talking about knowing languages!

I like the term “Acquaintance” to mark any general knowledge about a language and an ability up through the beginner stages. It’s a useful tag for saying, “I’ve come into contact with TL (target Language), know a little bit about it and know a little bit of it.” It’s vague enough and humble enough to cover a wide range of levels below the rest. I would say that I’m acquainted with German, French, Italian, Spanish, and ‘well-acquainted’ with Biblical Hebrew.

Let’s skip forward to “fluency”. This is probably the most difficult term, because it’s used for such a broad range of abilities. Benny Lewis of Fi3M fame pins it as low as B2 (in his Fluent in 3 Months, Kindle loc. 674). I suspect most people think of it as higher than this, C1 at least. For most people, fluent means something close to native-like: an ability to speak about a broad range of topics, in depth, without any errors that hinder communication. It’s, frankly, difficult to reach such a level in an L2, primarily because the sheer time needed to go from B2 to C1, and then to C2, is really quite vast, and requires a lot of time functioning in the L2. I rarely say I’m fluent in a language (except English!).

What’s between the two? I like to use the terms “competent” and “proficient”. Recently I’ve been reading Alice Omaggio’s Teaching Language in Context: Proficiency-Oriented Instruction, which is an interesting book for all sorts of reasons. Although these terms could be used interchangeably, or with various nuances, I treat ‘competent’ as lower than ‘proficient’. “Competent” in my mind is someone who can understand the language without frequent miscommunication, and can manipulate the language to express what they want: a lower-intermediate level. “Proficient” in my view is more what Benny thinks of as “fluent”: the speaker has a mastery of competency, so to speak. They are able to discourse about a range of topics, and have socio-communicative strategies for managing when they are out of their depth. They are not near-native, but they can function and survive in a wide array of TL settings without needing to resort to their L1s.

Generally I would say I am competent in Gaelic. I would say I (was) proficient in Mongolian, though it’s probably slipping. Depending on audience I am usually happy to say I’m proficient in Ancient Greek and Latin. Primarily because, although my conversational skills are low, I am rarely called upon to speak in these languages, and my high level reading abilities mean I am equipped for what I do in them.

Of course, in an ideal language-learning world we would have unlocked all secrets and have a fast-track method for moving students rapidly from A1 to C2, from acquaintance to ‘fluency’. But we don’t; we just make up these labels to try and categorise a range of phenomena – our raw data on those times when we succeed or fail in communication, whether transmitting or receiving. What’s more important, in my view, is the simple principle of language-learning momentum: to keep moving forward rather than backward in one’s L2s.


Sometimes I just say 5 and move along with my day.

Software for ‘reading’ foreign language texts (2)

Okay, let’s look at Learning with Texts, and how to get set up and going with it.

A lot of the information in this post is derived from this post over at DIY Classics. I 100% recommend you go and read that post. My aim is to supplement it by giving some more specific details, and by talking through FLTR and how to use the two tools.

Your first port of call should be the LWT page, which is full of information, including what to do if you want to set up your own localised version, which involves running it on a server. That’s not most of us, so thankfully Benny of “Fluent in 3 Months” fame runs a free hosted service. So next stop is to go to his website:

http://www.fluentin3months.com/wp-login.php?action=register

and register a username and password, then use this to go to:

http://lwtfi3m.co/

and use it to log in.

For all the set-up details, follow the rest of that post at DIY Classics (though note point 5 below) – here is the link again. But if you are feeling lazy and want to stay here, here’s a brief run-down:

  1. Click on “My Languages”
  2. Click on the green plus sign or “New Language” and add Latin/Greek
  3. For dictionary, you want to add:

Latin: http://www.perseus.tufts.edu/hopper/morph?la=la&l=###

Greek: http://www.perseus.tufts.edu/hopper/morph?la=gr&l=###

I would delete the Google Translate URI because it doesn’t exist for Ancient Greek, and isn’t good for Latin.
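For the curious, the ### in those URIs is the placeholder LWT fills in with the word you click on. A rough sketch of the mechanics (my own illustration, not LWT’s actual code):

```python
from urllib.parse import quote

# The dictionary URI exactly as entered in LWT; ### marks the word slot.
TEMPLATE = "http://www.perseus.tufts.edu/hopper/morph?la=gr&l=###"

def lookup_url(word: str) -> str:
    # Percent-encode the (possibly polytonic Greek) word
    # and drop it into the template where ### sits.
    return TEMPLATE.replace("###", quote(word))

print(lookup_url("λόγος"))  # a Perseus morphology URL with the word percent-encoded
```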

  4. For Character Substitutions: ´=’|`=’|’=’|’=’|…=…|..=‥|«=|»=
  5. Especially for Greek, you want:

RegExp Split Sentences:  .!?:;•

RegExp Word Characters: a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ

These two values make sure that LWT correctly works out (a) what starts and ends a word, and (b) uses a Unicode range that includes polytonic Greek characters. Both are important in getting Greek to display and function properly. Notice that the RegExp Word Characters value is different from what DIY Classics gives; I found theirs didn’t work for me.
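If you want to see what that character class is actually doing, here’s a quick test of it in Python (outside LWT, purely as illustration):

```python
import re

# The LWT 'RegExp Word Characters' value, wrapped as 'one or more word
# characters'; the ἀ-ῼ range is what pulls in the polytonic Greek block.
WORD = re.compile(r"[a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ]+")

text = "Παῦλος ἀπόστολος Χριστοῦ Ἰησοῦ"
print(WORD.findall(text))
# → ['Παῦλος', 'ἀπόστολος', 'Χριστοῦ', 'Ἰησοῦ']
```

Drop the ἀ-ῼ range and the letters carrying breathings and accents stop matching, so words split apart mid-word – which is exactly the problem the setting fixes.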

  6. Select your language from the home screen
  7. Click “My Texts”
  8. Click “New Text”, and enter a title, paste in your text, and tag it as you please.

Okay, you should be all set up for Greek and/or Latin. Once you’ve done that, go to “read” – it’s the first icon listed on the text page. You’ll get a screen with four components:

[Screenshot: LWT 1]

In the top left is your LWT menu. Notice that it lists 839 “To Do” – those are untagged/unknown words in the text. Next to it is an “I know all” button: basically, click this if you know every single word in a text and can’t be bothered tagging them.

Below this is the reading pane. In this is the text you’re working with. You can see in the first screen shot that I’ve left-clicked on ἀπόστολος, which has given me several options.

In the top right pane I’ve got the option to edit this term; it’s listed as a new term, and I can add in a translation as well as some tags – perhaps I would add: noun, masculine, nominative, singular. “Romaniz” is for a Romanisation of foreign alphabets. At the bottom is a coloured status bar: 1-5, from unknown to well-known, then “WKn” for so well known you don’t want to worry about it ever again, and “Ign” for ignoring the term, useful if your text contains non-words or words not in your target language.

The bottom right pane that’s opened up is the dictionary look-up, based on what you listed in Dictionary 1 in the settings. In this case, it’s gone to Perseus just like I told it to. The bottom frame works like an inset webpage, so you can click through to the LSJ or Middle Liddell entry as you please.


Foreign Language Text Reader

I also want to talk about Foreign Language Text Reader (FLTR). FLTR is like a slimmer, maybe dumber, version of LWT. But its two greatest strengths are that it runs locally on your computer (without running a server!) and that it’s super simple.

For FLTR head to https://sourceforge.net/projects/fltr/

Follow the download and installation instructions, they are pretty straightforward.

Open up the program and you’ll see a very basic interface.

First you’ll want to find the line of options that starts with Language, click on New, type in your language, say “Greek”, and then we want to edit the settings for the language.

Pretty much the settings are the same as for LWT, so here’s what I use for Greek:

Char Substitutions: ´=’|`=’|’=’|’=’|…=…|..=‥|«=|»=

WordCharRegExp: a-zA-ZÀ-ÖØ-öø-ȳͰ-Ͽἀ-ῼ

DictionaryURL1: http://www.perseus.tufts.edu/hopper/morph?la=gr&l=###

You may want to come back and increase the text size later as well. I find the default is often too small.

Then you need to add some texts. Personally, I add them via this method:

Each language will have a subfolder wherever you installed FLTR. Open this up and you’ll find a folder like Greek_Texts; save a text file (*.txt) in UTF-8 format in here, and you can use it in FLTR. So, for example, grab a copy of, say, 1 Peter, put it into a text editor, switch to UTF-8, and save it here. Open up FLTR, select Greek, and then select the text. If it hasn’t appeared, click refresh and double-check that it’s in the right folder. Here’s a screen-shot of some Greek text open in FLTR:
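That save step, sketched minimally in Python (the filename and the sample text are my own example; check where your FLTR install actually put its Greek_Texts folder):

```python
from pathlib import Path

# FLTR keeps one texts subfolder per language next to its install location.
texts_dir = Path("Greek_Texts")
texts_dir.mkdir(exist_ok=True)  # FLTR normally creates this for you

# Writing explicitly as UTF-8 is the important part: it keeps the
# polytonic Greek characters intact for FLTR to read back.
sample = "Πέτρος ἀπόστολος Ἰησοῦ Χριστοῦ ἐκλεκτοῖς παρεπιδήμοις"
(texts_dir / "1Peter.txt").write_text(sample, encoding="utf-8")
```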

[Screenshot: FLTR Greek]

You can see that I haven’t used it a lot with Greek, as most of the text is blue, which marks new words you haven’t tagged. Green words are well-known; yellow words are level 4, pretty well known; the colours shade down to red, which is levels 1-2, not very well known. A left-click will bring up two pages:

[Screenshot: FLTR Greek 2]

Firstly, it will bring up the editing screen for that word, which you can fill in with the relevant information; it will also open the linked dictionary in a web browser. A right-click on a word brings up a smaller dialogue box, and I can edit from there as well, without it forcing open the web-browser/dictionary. Here’s another screenshot, with a short text about Bonnie Prince Charlie in Gaelic; you can see I’ve worked both with Gaelic and with this text before, because most of the text is coloured in.

[Screenshot: FLTR Greek 3]

What’s the point? Or, Pros and Cons

A student helpfully asked me how this is better than, or different from, using, say, morphologically tagged Bible software à la Accordance, BibleWorks, Logos, etc.

  • It’s personalised, and testable. Every entry is put in by you, and so it’s filled with whatever you wanted to include.
  • It’s geared towards reading and familiarity. It doesn’t mindlessly tell me all the information for each word as I scroll my cursor across; it colour-codes according to how well I know the word and what information I have included.
  • It’s faster than doing this manually. Reader’s editions are great, writing on paper is great. This lets you tag your own texts digitally, and it saves those tags across languages, which is great when you’ve encountered a word once, and then find it again 3 years later in a difficult text.
  • It’s easy to work with the same interface across multiple languages. This is my preferred way of dealing with foreign texts. I use it for Gaelic, Mongolian, French, German, Italian, and am exploring its use for Greek and Latin.


Cons:

  • There’s no real way to do actual morphological tagging, so every inflection of ἀπόστολος is going to be a separate entry. LWT does nothing to alleviate this. FLTR does have a little drop-down when you’re entering a new word that lists similar words, so if you have already entered, say, a different form of the word, you can more or less copy what you had elsewhere. I suspect there isn’t an easy way to fix this, since you would need some way of teaching the software to do morphology for multiple user-input languages.
  • It’s slow to get started. Opening up 1 Peter and seeing 839 new words to tag, if you already have some experience in Koine, is not a thrilling experience, because this takes time. If you were starting from scratch in a language, it would be more rewarding. But if you’re already ‘on the way’, then it’s slow to get going. But it pays off. This week I opened up a new Gaelic text I’d never tackled, and at least 90% of words were already tagged. This is the pay-off.

Conclusion:

So that’s it. I’d be interested in your feedback, if you’ve had some experience or if you go and try it yourself now. Let me know if you have any difficulties in set-up or need a hand.


Tips and Advanced usage

Tip #1: You can select a string of words as a group; this is great if you want to tag a whole phrase that, for instance, might function idiomatically.

Tip #2: FLTR allows you to select “Vocabulary” as a text. This lets you filter a range of vocabulary by ‘knownness’, from a specific text or from all texts, limited to a set number of entries, and sorted alphabetically, by status, or randomly.

Tip #3: You can also access your FLTR vocab as two files in your main FLTR directory: one is plain text, say Greek_Export.txt, while the other is a comma-separated version, Greek_Words.csv. These files aren’t very useful to look at, but they are in the same format as the LWT TSV export, so you can actually move between the two programs.

Tip #4: You can import these exported files (from FLTR or LWT) into an Anki deck, if that’s how you like to operate.
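
For Tip #4 the conversion is small enough to script: Anki’s plain-text importer reads one note per line with fields separated by tabs. A hedged sketch; the sample rows and the column positions are assumptions, since the exact layout of Greek_Words.csv isn’t described here, so open your own export and adjust the indices:

```python
import csv

# Two sample rows standing in for the real export (assumed layout:
# column 0 = term, column 1 = translation; adjust to your file).
with open("Greek_Words.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows([["λόγος", "word, speech"],
                             ["χάρις", "grace, favour"]])

TERM, TRANSLATION = 0, 1
with open("Greek_Words.csv", newline="", encoding="utf-8") as src, \
     open("anki_import.txt", "w", encoding="utf-8") as dst:
    for row in csv.reader(src):
        # Anki's text importer: one note per line, fields tab-separated.
        dst.write(f"{row[TERM]}\t{row[TRANSLATION]}\n")
```

The resulting anki_import.txt can be loaded via Anki’s File → Import dialog.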

Software for ‘reading’ foreign language texts (1)

So, particularly if you’ve come from a Greek/Latin/Classics background, most of what one is taught is how to do a lot of intensive reading. Intensive reading involves taking relatively small segments of text and analysing each word and segment grammatically, mentally parsing and tagging things, and then understanding how the clause, sentence, and paragraph fit together.

There’s a place for that. But it’s generally not the best way to move towards a more fluent reading approach. On the other hand, the grammar-translation approach almost never employs something at the other end of the spectrum, extensive reading (http://en.wikipedia.org/wiki/Extensive_reading). There’s lots of good material and research arguing for the efficacy of extensive reading.

One of the barriers to reading, particularly in classical languages, is that there simply aren’t enough suitable texts for reading. There aren’t really graded readers, Lingua Latina being a sole exception. There certainly aren’t extensive series of readers pitched at stable levels, designed to move readers slowly and surely to greater proficiency and expand their vocabulary. And a great gap is that there isn’t any YA literature. YA literature, I would say, is actually amazingly important; its language is just a step below ‘adult’ literature, but it’s (usually) interesting and engaging, and adults can read it with genuine enjoyment while at a slightly lower language level.

Anyway, to read extensively in a way that works requires a fairly high level of comprehension. One needs to be recognising upwards of 90% of what’s going on in order to figure out the other 10% from context. Maybe more, maybe less. Certainly there’s a point at which there are too many unknowns and the reader gets lost.

So what if there are no texts suitable for your reading level? This is where I think some reading tools will help a great deal. Basically, we want to remove the barriers that slow reading down to the point of frustration. The main difficulty to be overcome is vocabulary – how do we raise our vocabulary to a level where more and more is comprehensible? What if we accelerated and integrated the ability to look up words, and if we made readily available on the same reading ‘page’ the meanings of every word we’ve ever encountered? That is the premise of the reading software I’ll be looking at in the next two posts. By recording and tagging *every single word*, you can see at a glance what you know, what you have previously encountered, and what you’ve never encountered.
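
That at-a-glance idea can be sketched numerically: given a list of ‘known’ words, the coverage of a text is just the share of running words that appear on the list. The word list and verse below are for illustration only; note how θεόν fails to match θεὸς, since without morphological tagging every inflected form is its own entry:

```python
import re

# Hypothetical list of already-tagged 'known' forms (illustration only).
known = {"ὁ", "καὶ", "θεὸς", "λόγος", "ἦν", "ἐν", "ἀρχῇ", "πρὸς", "τὸν"}

text = "Ἐν ἀρχῇ ἦν ὁ λόγος καὶ ὁ λόγος ἦν πρὸς τὸν θεόν"
tokens = re.findall(r"[Ͱ-Ͽἀ-ῼ]+", text)  # Greek and Greek Extended blocks

# Case-insensitive but otherwise exact-form matching: θεόν ≠ θεὸς,
# so each inflection counts separately, just as in FLTR/LWT.
known_lower = {w.lower() for w in known}
hits = sum(1 for t in tokens if t.lower() in known_lower)
print(f"{hits}/{len(tokens)} tokens known")  # 11 of 12: about the 90% threshold
```

Eleven of twelve running words known puts this toy text right around the comprehension level where extensive reading starts to work.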

If someone is mainly working with texts that have already been tagged to death, e.g. biblical texts or classical texts in the Perseus collection, then maybe something like this isn’t so necessary; but once you’re outside those corpora, essentially one is tagging one’s own texts. This is a digital way to do it across texts, instead of covering a sheet of paper in notes (I’m sure plenty of us have done that) or relying on ad hoc software solutions.


In my next post I’ll talk through Learning with Texts and Foreign Language Text Reader.

Hunting with Where are your Keys?

I had an interesting (new) WAYK experience over the weekend, and thought I would share it with you this week. Most of the time that I’m using WAYK it’s to try and communicate pieces of language to someone else. That is, I’m in the role of a teacher, and I’m trying to give bite-sized pieces to a learner to assimilate and acquire language.

But that is only one side of the equation with WAYK: you can equally use its techniques to obtain language when you’re a learner, because WAYK is, at heart, simply a set of techniques to facilitate and accelerate language acquisition. So over the weekend I sat down with a friend of mine, who is a non-native but fluent German speaker (former German teacher, lived in Germany, has a German wife). He was my ‘fluent fool’ for the session.

I explained very briefly what I was going to do, and then set up a table. To be fair, I already had some vocabulary and grammar and initial sentences, and knowing “Was ist das?” probably really helped too, but other than that my German is really at a low level.

It was very interesting once we shifted into German. He quickly understood generally what I was trying to do, and specifically what I was trying to elicit. I did have a prop problem in that I didn’t set up red-pen/black-pen clearly, and so I couldn’t draw out adjectives from him. At the same time, the difference in the types of pens I had on the table launched him into a fully German explanation of the different words for ‘pen’ in German!

One of the things I appreciated was that he mainly stayed in our target language, and would often go on in Deutsch for quite a while. I think in different contexts the learner would need to control this, but for me it was entertaining, and mostly comprehensible. When I wanted to bring it back to earth, I would just go back to constructing simple sentences to check ‘is this right?’

A couple of times he veered into English language grammar explanation. Again, for me this was fine, partly because I don’t mind learning grammar, partly because of the informality of our session. I think in a more focused series I would want to encourage him back into the German more frequently.

One of my reflections is that when you stay in the TL and you’re trying to elicit certain structures, it becomes apparent much more quickly when things aren’t going to plan: you had a bad set-up, you can’t get the piece of language you want, and you can’t express what it is you’re trying to elicit. So much is about set-up; get it right and you’ll get the language you’re hunting.

Theory Friday: Flashcards

One of my sidelines this year is to tutor an hour a week for students who are tutoring other students in introductory Greek. It always seems complicated to explain that. We are only in our second week, and in fact the student-tutors have not yet commenced their tutoring of their students, so we have taken the opportunity to do a little bit of meta-thinking about language acquisition/teaching and methods. Of course, you know this is just the kind of thing I like to do.

What follows is a post-factum write-up of some of the things I covered.


This week we spent some time thinking and talking through Flashcards. Good old flashcards!

What is the quintessence of the flashcard, physical or digital? I take it that it is the direct correspondence of one discrete unit of information with another. This is both the genius and the weakness of the flashcard approach. For the flashcard can never get away from this 1:1 correspondence model; even when it becomes something like X:y,z,a,b,c, it is still operating on a correspondence model. At the same time, this segmentation and compartmentalisation is what allows it to work so well for massive rote-processing. Once we accept this limitation, we can think through two related questions:

  1. How do we mitigate the traditional weaknesses of flashcards?
  2. How do we complement the use of flashcards for better learning outcomes?

I’ve gone back and forth on flashcard use. I think that overall they are an inferior method of learning vocabulary in general. But they do have their uses, which is why I swing back to using them occasionally. Their advantage is that they allow massive rote learning of vocabulary by a relatively automatic process. This is very useful for initial stages, at which constructing materials or finding texts that allow high comprehension is difficult. This is one reason flashcards should, in general, be built on corpus frequency – high frequency vocab initially acquired by flashcards can rapidly be both solidified and nuanced by extensive comprehensible input.
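
That corpus-frequency point can be made concrete: count every running word-form in a corpus and build cards from the top of the list down. A toy sketch, with a made-up mini ‘corpus’ standing in for real frequency data (with a real corpus you would read files instead of a string, and ideally lemmatise rather than count raw forms):

```python
import re
from collections import Counter

# Made-up mini 'corpus' for illustration only.
corpus = "ὁ λόγος τοῦ θεοῦ καὶ ὁ λόγος τῆς χάριτος καὶ ὁ νόμος"

# Count every running word-form; the highest-frequency items are the
# cards worth making first.
freq = Counter(re.findall(r"[Ͱ-Ͽἀ-ῼ]+", corpus))

for word, count in freq.most_common(3):
    print(word, count)
```

This is essentially what published frequency lists (and the frequency-ordered decks built from them) do at scale.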

The weaknesses of flashcards include some of the following:

  1. Encouraging a correspondence/translation approach to language
  2. Reinforcing native-language thinking patterns
  3. Presenting words in a decontextualised manner
  4. Prioritising visual-textual learning processes
  5. Being ineffective for structures

Point 1 is the most difficult, because it is the quintessence of flashcards. I think it best to mitigate this through complementary approaches. Point 3 can be mitigated by including contextual information on one side of the ‘card’: a sentence, clause, or even a phrase can contextualise new information in a way that provides more language-oriented material than a mere ‘mental factoid’ gloss. This sentence or phrase should be relatively simple, something the learner can process without any great mental difficulty; i.e. it shouldn’t be a sentence beyond what the learner is already comfortable with, and the rest of the words should be familiar and immediately understood.

Point 4 and point 2 can be mitigated in relatively complementary ways: by replacing native-language glosses with pictures, and/or by using audio. Pictures have two downsides: they require a very large up-front cost in generating a deck with relevant pictures (I’m thinking digitally here), and their specificity can be problematic; one has to think carefully about how to use a picture to represent a concept unambiguously. I’ve never seen an audio-to-text deck, but I think it would be brilliant if one whole ‘side’ of the deck were just audio material. I suppose the extreme would be audio-to-pictures.

Point 5 has to do with things like untranslatable particles, modal markers, etc., things that need to be understood as part of a larger unit. In Greek, ἄν is my default example: practically useless on a flashcard. This can be mitigated by embedding these kinds of words/structures into sentence-level units and highlighting/underlining/otherwise marking the targeted information. The learner is then responding to some actual language use, while being reminded of the target information.

On to my second question: how to complement flashcards. As I said earlier, flashcards are really an intense boot-camp for acquiring a basic vocabulary. Personally, I think they make best sense when used solo, when other materials are not available. I would complement them with carefully constructed graded reading material, which establishes contextually comprehensible input and increases reading proficiency, and with verbal practice of some sort. Flashcards would then be used to pass your time on the bus or the like, reinforcing this basic information in another form. Next time I’ll talk about intensive and extensive reading practice and gradation.

Would love to hear your thoughts/experiences/relevant research on the topic of flashcards.

Why People continue to teach via Grammar-Translation

Foreword: this is our last post for 2014. Enjoy your holidays and you’ll hear again from the Patrologist in 2015.


In this second of twin posts I’m exploring common ‘defeater’ reasons people give for sticking to GT as a method and rejecting approaches like Communicative Instruction. In each case I give a brief explanation of the belief, and some counter-points.

  1. It doesn’t work

I.e. applying CI or other modern language approaches to Classical Languages ‘doesn’t work’. I’m not sure this is a sincere objection, I suspect it’s rather of the order of ‘I don’t want to engage this idea and I’m blanketing you out’.

That it has sometimes not worked, that it is not pervasive, or that it doesn’t always go perfectly: these are not arguments against it. In fact it has worked and it does work. A few timely videos of those few individuals with a decent speaking facility in Latin or Greek show that it is by all means possible. Nor is it possible only for the elite; it is possible for all students.

  2. Dead languages are different

Not heard as often these days, but for quite some time people would say things like, “Latin is no longer spoken, therefore our method of learning must be different.” They were generally not making a comment on the difficulties of learning a language no longer spoken (i.e. lack of speakers to talk with) but asserting a fact about the nature of a no longer spoken language.


Which is absolute nonsense. Latin is not a different kind of thing from other languages simply because it lacks a speaking community (I don’t wish to debate whether it does in fact have such a community at present). Latin is a language, which means it can be learnt as a language. The number of speakers currently using a language has zero bearing on whether it can be learnt as a language or must be learnt as a ‘something other’.


If, heaven forbid, all French speakers were wiped from the face of the earth tomorrow by some new, virulent, French-speaker-targeting super-virus, this would not alter the kind of language that French is. It would certainly create obstacles for anyone wishing to learn French, and given the fate of its speakers I can’t imagine anyone rushing to do so, but French itself would not have changed.


So too with the classical languages: if they are languages, they may be learnt as such.


  3. It’s too hard

And remembering arcane rules of grammar that appear once every 10,000 words isn’t hard?


Yes, I would say, learning a language is hard. It’s hard to learn it as a spoken language, and it’s hard to do GT. All GT students know that! And all students who acquired an L2 as adults know that it was hard too. It took hours, it was tiring, it involved a lot of interaction with speakers, probably embarrassment, and there were many highs and lows and plateaus as well.


But it’s not harder. GT is not only hard, it’s often incredibly boring. CT is hard because it requires more investment, but it yields greater satisfaction, it’s more interesting, it’s more motivating. It’s far better to go home from a lesson of CT having interacted in the Target Language for 1-3 hours, and have one’s head swimming in the TL, than to go home after 1-3 hours of GT with one’s head full of “The wicked sailors gave roses to the good girls.”


  4. It takes too long

Basically this reason says that while CT might ‘work’, it is slower, takes longer, and in the end takes too long for the results it promises. Better, in this view, to stick with GT, which requires fewer hours and gets us ‘somewhere’ faster.

I am almost convinced this is a valid point. I’ve written several times about how many hours working with Comprehensible Input in the Target Language might be required to achieve decent levels of competency, and they are considerable. Anyone learning a modern L2 knows this. I think those invested in teaching classical languages need to be very up-front and honest that considerable time investment is necessary.

However, what I would say is this: I don’t think GT takes fewer hours to get to the same place. I think GT takes fewer hours because it teaches and achieves far less. For GT practitioners to achieve real reading fluency takes many, many hours, which is my contention under point 6. In this instance we should not compare apples and pears. Furthermore, from what I generally hear from school teachers using CI-based instruction, their results outstrip traditional methods, especially when they spend a little bit of time prepping their students for the kind of tests that traditional methods favour. If that little bit of prep time isn’t there, CI students often simply don’t understand the jargon of grammar questions. No wonder, since they didn’t need it.

So let’s hold off on conceding that GT is ‘faster’, because it may not be faster and it may not even be to the same destination.

  5. It doesn’t match our goals

What are ‘our’ goals? I think this is a really important question, and a debate worth having. Often it seems like the goal of classical language instruction is grammatical analysis, but I’m sure most people don’t actually think this is the goal. Isn’t the real goal to be able to understand, appreciate, and interpret texts in classical languages, and so to discuss and engage their ideas and content? Isn’t it ultimately the content, not the form, that interests us? And while content and form are never divorced, just as culture and language are inseparable, they are distinct things.


If our goal was to train grammarians, then grammar is what we ought to teach. There’s nothing wrong with being a grammarian, of English or of classical languages. And in fact, probably some people do want to study the grammar of ancient languages. We need those people! But that’s not the goal of most students, or of most programs.


CT approaches do match our goals, they drastically and desperately match our goals. The claim that no one needs to know how to order a latte in Koine is irrelevant. That’s not our goal either. The goal of CT is to produce competent users of the language with an active facility that enables reading and comprehension of texts in the target language without recourse to translation or grammatical analysis for the purpose of understanding. (Though translation and grammatical analysis may be done for other purposes).


  6. GT is how I learnt, so it works

People who end up as teachers of classical languages via GT are the 4%. That is, they are often the small minority for whom GT ‘clicks’, who ‘get it’, who enjoy it, while the rest of the cohort is destroyed by a war of attrition fought with boredom and irrelevance.


And some of these teachers get very, very good at Greek, Latin, what have you. Especially those that do doctoral programs that require epic amounts of reading of primary language material. But this is my hunch – it wasn’t GT that got them to that point, it was using GT to render those texts comprehensible, and having a huge exposure to comprehensible texts over time. It was Comprehensible Input that gave them competency in reading directly, and this was only indirectly the result of GT.


I could be wrong, but I could be right too. People whose primary discipline is classics or the like, who studied primarily via GT, and who achieve marked ‘fluency’ in reading ability, often have a pop- or folk- view of language acquisition that is poorly informed by research or SLA theory, and dominated by the insular views of their own discipline and experience.


Even if it did work for you, why should we stick to a method that works for the 4%? What about the 96%? What if we used methods that meant classical languages were learnable by all, not the self-selective and self-satisfied ‘elite’? Wouldn’t that open up the field for the simple ploughman in the field in a whole new way?

Why People teach via Grammar Translation

In twin posts I’m going to explore some of the reasons people teach classical languages (by which I mean Ancient Greek, Latin, and similar languages that are mostly no-longer spoken and primarily of academic or historical interest) via the Grammar-Translation method (i.e. teaching grammar explicitly and training students to translate into their native tongue for the purpose of understanding). The second post will follow up on this one and tackle some issues more directly.


  1. That’s how the Ancients did it


People often think that ancient students of foreign languages learnt primarily via Grammar-Translation. I think this is incorrect. Firstly, the idea is often prejudiced by the fact that the rhetoric-based education system of Greece, then Rome, included explicit grammar instruction as the foundational stage of language and literature study. However, this does not mean those students learnt either their L1, or really their L2s, via that grammar instruction. In the case of upper-class diglossia among Romans, who often spoke Greek quite well, this should be tempered by the very fact of that diglossia: they had a living Greek-speaking community into which they were being initiated.


  2. That’s how we’ve (‘Classics’) always done it

Again, largely untrue, this time for two reasons. I recommend anyone interested in this to read two books: Waquet’s Latin: Or, The Empire of the Sign, which deals in part with Latin’s socio-cultural place in the 17th and 18th centuries and explicitly discusses shifts in pedagogical practices; and James Turner’s Philology: The Forgotten Origins of the Modern Humanities, which discusses in depth how, in the Anglophone world, the practice of philology ‘disciplinised’ into the modern humanities, including classics, especially from 1850 onwards. ‘Classics’ as a distinct discipline along the lines we know it today did not exist before then, and the revered pedagogical practices that dominate it often go back no more than 200 years, or less.


  3. That’s how my teacher did it

The path of least resistance for teachers is generally to teach what they know, how they learnt it. For many, myself included, Second Language Teaching (SLT) was explicit grammar instruction paired with translation exercises. Regardless of beliefs about SLA, the pressures of teaching often ‘push’ us to simply teach with what is ‘easiest’, and what is easiest in many classes is to pull out a textbook and replicate our own teachers.


  4. That’s what worked for me

Those that teach classical languages, make no mistake, are often those who did really well at them. And with a combination of points 1-3, the self-fulfilling elitism can be deafening.


In a recent discussion relating to why certain advocates of ditching G-T were so down on G-T, someone helpfully pointed out to a newcomer that all the people in the discussion who were down on G-T were those who had been very successful at G-T. This argument isn’t, generally, coming from those who failed because of G-T, but from those who succeeded at the 4% method and have come to consider it deeply flawed. Just because it worked for you doesn’t mean it is a viable methodology in general. Indeed, the self-selection involved in ‘it worked for me’ actually means, “it will work for people like me, and that’s the only type of student I care about”.


  5. That’s what our goal is.

I’m going to tackle this much more thoroughly in the next post on this question, but some people think G-T achieves the kinds of goals we want in these disciplines. What does G-T achieve? It produces Grammarians and it produces Translators. Those are two good things, but is that the goal of classics and related disciplines?


One of the problems is that grammarians often try to do linguistics, and when they do it’s usually second-rate linguistics, because they’re grammarians. The problem with translators is that they learnt to translate from a language they’re not competent in, instead of achieving competency first and then learning the art of translation. Meanwhile, don’t we actually want to train people as historians, litterateurs, theologians?

Exegesis as Reading

A little while ago there was an exchange that started on B-Greek, a place I generally read but do not interact, about Exegesis. Then there were a few blog posts, one by D. Streett, one by B. Hofstetter.

I often deliberately don’t engage in discussions that I don’t have time for, but obviously I have opinions. Particularly outrageous, bombastic ones. So part of my heart warms when R. Buth writes, “This is why I define “exegesis” as learning to extract meaning from a language that one does not control. “

Somewhat like Barry, I had already done a degree that was virtually Literature studies, as well as started down the Classics track, and taught myself the fundamentals of Koine Greek, before I got to seminary. One of the reasons “exegesis” is so problematic is that Biblical Studies got hived off, with so many other humanities disciplines, into a discrete ‘discipline’ about 200-150 years ago. In this regard, see the recent volume by James Turner, Philology: The Forgotten Origins of the Modern Humanities, which among other things does a good job of explaining how the humanities ‘disciplinised’.

Exegesis as practised by most biblical studies students is a process of analysis and interpretation of a text at a fine-detail level: grammatical, lexical, syntactical analysis of words, phrases, verses, put together at the level of a paragraph. It rarely moves beyond paragraph level. It is, as Buth points out, often done by students/scholars who actually have no “control”, that is no genuine active competency, in the target language.

Let’s just stop and say, “That’s odd.” We would find that incredibly odd for a modern foreign language student. “Oh, you can’t speak a word of French, but you can analyse to death the syntactic choices of individual sentences in Camus’ The Plague?”

I want to be really clear here: there is a place for fine-detailed studies of grammatical, lexical, syntactical elements of small units. Everyone else calls this linguistics. And many, many of these ‘questions’ disappear, or better yet are disambiguated by a genuine competency in the language.

Take a step back – what is the purpose or goal of exegesis? To acquire a better understanding of the meaning of the text. We can call this ‘exegesis’, but we may as well call it ‘reading’, though we must keep in mind that ‘reading’ here actually means something like ‘interpreting’, i.e. we are engaged in attentive, analytical reading at micro and macro levels. Or, “literary criticism”.

I don’t think calling it ‘reading’ is always helpful. In my school there were some teachers who used to say nonsense like “We don’t interpret the Bible, we just read it.” Which was always doctrinal drivel, an attempt to avoid the theological difficulties that the very idea of ‘interpretation’ generates. No, that won’t do. Reading itself is an interpretive act.

On the other hand, reading here is a higher-order activity than mere reading, and it really must go beyond the sentence level. Unless you can get to the level of discussing a whole text, a whole book, you are missing the integrity of the text and cannot complete your reading. That’s why discourse analysis, or just plain literary criticism, needs to work at the macro level.

To wrap up, I am constantly amazed to interact with so-called ‘critical scholars’ who look at, say, a book like John’s Gospel and see nothing but a pastiche of cut-up pieces that represent a proto-Gnostic text re-edited by a proto-Orthodox editor, then re-edited again. Why do they see only that? It’s because they analyse a painting by looking at each blob of paint from a stroke of the brush and considering it a different source. They never step back and see the artistry. Whether they are right or wrong is irrelevant to the fact that they can’t step back and look at the whole, can’t discuss the meaning of the book, can’t discuss themes, genre, art, motifs. Because they can’t decide which of 400 types of genitives the proto-Gnostic redactor meant, and their competency in the language is like that of a tourist who got off the plane with an antique reference grammar of the language and nothing else.

Why I do Sub-Optimal Language Exercises

Why bother doing anything but the best types of language acquisition activities?

I’m a firm believer in Comprehensible Input, and fairly sold on Krashen et al., that CI is the key to language acquisition. I don’t quite buy Krashen’s “strong” version that nothing but CI is necessary, because I think he’s framing the question a little incorrectly. Krashen these days makes a strong claim that CI, only CI, is sufficient by itself for language acquisition. I think this might be true, but there are other aspects of language competency that are perhaps not quite ‘acquisition’. The ability to speak, write, produce output is probably a secondary outcome of acquisition, but in my view and experience one still needs some practice in these output skills in order to actually output.

Anyway, I do all sorts of activities that are not optimal CI activities. I read texts too difficult for me. I do ‘composition’ exercises that are really translation exercises of banal sentences from English to Greek/Latin. Lately I have been working on an idiosyncratic but modern translation of the New Testament (I’ll write more about that individually later on). Why? Why waste time?

  1. Don’t wait for the best.

There is no way to get optimal CI in Greek or Latin. There’s no language community, there are no children’s cartoons, no five levels of graded readers about contemporary society, no young-adult extensive reading materials. One will never derive enough genuine CI from currently available resources.

  2. Output exercises are nonetheless moderately useful.

Because (a) they develop output automaticity, even if no new language is being acquired; and because (b) the process of doing the exercises involves some CI, even if suboptimal.

  3. The art of translation is itself an art to be acquired.

While it’s generally and genuinely preferable, in my view, to work mentally in the target language, there are times when one will want to translate – in either direction. There are structures of phrasing and thought that come to one naturally, and in the absence of knowing a target language structure, you tend to code switch or break thought. Working systematically to acquire some of these structures will improve translation ability.

  4. For the sake of others

I think a previous generation thought you acquired language competency largely by suffering and toil. They were wrong about that, but using sub-optimal methods does require suffering and toil, because the amount of time required to get the same amount of genuine CI is so much greater. The only way we will produce teachers competent enough to utilise more optimal methods is if we have teachers who are prepared to suffer a little to acquire the hard way, and generous enough to pass it on by an easier way.

Like a broken record

Q: Patrologist, why do you talk so endlessly about language acquisition?

A: Because our field is so broken. In no other field do so many people who know their target language so poorly talk with such authority. I honestly wish it wasn’t necessary, that we rather lived in a time, an age, a place, where we took for granted that people who studied ancient Greek literature knew ancient Greek, where people learned in Hebrew had learned Hebrew, where scholars of Latin had been schooled in Latin. But we do not live in such a mythical land, we live in its counterfeit where people peddle outdated methodologies to reach inadequate heights.

I believe this is changing, but slowly, and only because some are agitating – pointing out that the Emperor does indeed have no clothes. You can try it at home – approach a Greek professor or a NT one or whatever, and initiate a Greek language conversation. If you don’t get a quick χαῖρε, ὦ μαθητά, πῶς ἔχεις σήμερον; then there really is something wrong.

On the flipside, all I am saying is that we apply Best Practices from contemporary Second Language Acquisition to classical and biblical studies. This should be the least controversial thing in the world. And all I am discussing is how we can do that. There is a long road ahead of us. That’s why I keep talking about the same things over and over. Until the revolution comes.

Why there are no communicative language approaches in classics in Australia

1. Like most places, Classics and Biblical studies are dominated by teachers who didn’t train in language teaching, know little about language acquisition, and never acquired an active ability in their chosen languages.

2. The population is comparatively small.

3. Modern language teaching in Australia lacks even the small dedicated movement of those interested in fully communicative approaches (TPR, TPRS, etc.) that exists elsewhere, and so there is no possibility of spill-over into classical languages.

4. There’s thus no opportunity for teachers to attend workshops, seminars, etc., to be exposed or trained in these techniques.

5. Most online classes are run in what, for Australians, is the middle of the night, or the mid-morning of the workday, limiting the possibility of participation.

6. Summer intensives, like those run in the States, Europe, or Israel, all occur in the northern summer, which is not summer in Australia and so is not the summer break. Due to the extreme distance involved in travel, participating in one of those intensives (any of them) would cost, I have calculated, anywhere between $3,300 and $6,800, and generally one would not get away with less than $4,500.

7. The (small) population that are interested in classical languages generally don’t know about communicative approaches to these languages, don’t realise the benefits, don’t understand much about language acquisition, and are often monolingual to begin with, so there is little drive for such an approach.

Of course, there could be people doing things I haven’t heard about. If you’re in Australia doing communicative-type methods for classical languages, get in touch and tell me I’m wrong!

How fast can one learn a language?

I’ve written on this topic a few times previously, on my former blog, notably here and here. Today I want to explore a different side of this question.

 

I’ve previously suggested that to get to a level of ‘fluency’ in Ancient Greek or Latin, we might estimate 1100 class hours (based on comparison with contemporary Indo-European languages). That might break down to something like

100-150 hours (A1)

160-220 hours (A2)

400 hours (B1)

600-650 hours (B2)

800-900 hours (C1)

1100 hours (C2)

 

Maybe. We just don’t know. While there are certainly individuals with exceptional Latin and Greek ability, we don’t have quantifiable data on this.

Okay, so I’ve also said we probably need to get students to B2, at least, in a serious language program.

 

At 1 hour a week, 40 weeks a year, this is 15 years. Too Long.

At 3 hours a week, 40 weeks a year, this is still 5 years. Too Long.

At 6 hours a week, 40 weeks a year, we get down to 2.5 years. This would fit in a degree program.

If we wanted fluent speakers, i.e. that was the focus of the program, we need to raise the hours to, say, 1000. Then we really need to make it a full-time course.

At 12 hours a week, 40 weeks a year, this is still 2 years. On a standard 12 contact hours per week per semester, that is a student’s full load. Probably our notion of ‘contact hours’ needs to be radically altered from the ‘lecture + tutorial/seminar’ model of Arts/Humanities Faculties.

At 24 hours a week, 40 weeks a year, a student would complete this many hours in a year. Of course, that is about 4.8 hours in the language, with a teacher, per day. It is going to be a course both intensive and extensive, and we will probably not be able to teach them much else. If, heaven forbid, we design a two- or three-language program, then a year will not suffice.
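The arithmetic above is easy to reproduce. A minimal sketch in Python, using the post’s own estimates (roughly 600 hours to B2, a 40-week teaching year); these hour targets are guesses drawn from the figures above, not measured data:

```python
# Back-of-envelope calculator: years needed to accumulate a target
# number of class hours at a given weekly schedule.
# Assumes a 40-week teaching year, as in the scenarios above.

TEACHING_WEEKS_PER_YEAR = 40

def years_to_target(target_hours: float, hours_per_week: float) -> float:
    """Years of study needed to reach target_hours of class time."""
    return target_hours / (hours_per_week * TEACHING_WEEKS_PER_YEAR)

# The B2 estimate (~600 hours) at the weekly intensities discussed above
for hpw in (1, 3, 6):
    print(f"{hpw:>2} h/week -> {years_to_target(600, hpw):.1f} years")
```

At 1, 3, and 6 hours a week this reproduces the 15, 5, and 2.5 year figures; the 1000-hour fluency target at 12 hours a week similarly comes out at just over two years.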

All of this is a long, number-crunching prelude to say that it takes too long. As teachers, and even as learners, we must find ways to accelerate the process. To put it into Krashenesque terms, we want to work out how students can get maximum exposure to comprehensible input in the ‘sweet spot’: input that is interesting and at the limit of their comprehension, so that they are always getting things they can understand while acquiring things they previously did not have, with enough repetition of previously acquired structures, but not excessive repetition.

There are definitely no silver bullets for this. Indeed, it will vary by learner, by learning cohort, by life circumstance, and by teacher. However, of this I am certain: part of what it is to be learning, as a teacher or self-reflectively as a language learner, is figuring out ways to accelerate the language acquisition process. To become more efficient at using the time, and the inputs, available to us.

 

This is one of the things I like about Where Are Your Keys? Techniques, or “rules of the game”, are designed to be accelerators of learning. That’s why they exist. Every technique is built around that one facet: how to make language acquisition more efficient, more learning in less time. And that’s why there’s no arbitrary ‘cap’ on Techniques. Sure, there are quite a few standard TQs, but there’s no definitive no-more-than-these list. TQs get invented when something doesn’t work and someone comes up with a way to make it work. TQs formalise ‘what works’ and apply it. That’s meta-learning.

Interviews with Communicative Greek Teachers (8): Randall Buth

Quite a few people were waiting to hear from Randall Buth in this series of interviews. Today I present Buth’s answers to my interview questions! Thanks very much to Randall for taking the time to respond to this interview.

You can read the earlier seven interviews here: Sebastian Carnazzo, Michael Halcomb, Christophe Rico, Stephen Hill, Casper Porton, Jason Weaver, Jordash Kiffiak.

 

1. Randall, I wonder if you’d share a little about the environment and methods you were exposed to when first learning the biblical languages yourself?

Before the biblical languages I was given traditional Latin and German high school training. The German was done as “grammar translation” in the first year by a teacher who could not speak the language. That was received well by students and reinforced what we had learned from Latin learning. The second year of German was a bit of a shock for us students because the teacher spoke German(!), had a German speaking wife, and spoke German in class. We just wanted to have things on paper and to give “correct answers.”

Both Greek and Hebrew were first introduced to me as “grammar translation” languages, no different than Latin, with maybe the one difference being that the Hebrew and Greek grammar books had fewer pictures and less cultural background information than was included in the Latin texts. I jumped through the hoops faithfully and even read half of the Hebrew Bible before adding an experience that led to a transformation.

Things changed when I went to Israel and learned to speak Hebrew fluently. In the process, I noticed that my reading of biblical Hebrew changed. It is difficult to fully explain this by analogy or words, but I will give a brief attempt. Basically, Hebrew changed from being very fast, instantaneous crossword puzzles to a real language, to reading a language for content from within the language. I was young, early 20’s, and naively assumed that the field would gradually move in this direction over the coming decades. I could not imagine a program ignoring the benefits involved, nor had I ever met anyone who had gone through this process up to a fluent level that regretted the time spent or did not see it as qualitatively improving one’s reading and access to the text.

Reading theory linguists attribute these outcomes to automaticity where the morphological nuts and bolts of the language are backgrounded and dropped below conscious focus, which allows more of one’s working memory to focus on interpretation and content. In a word, spoken fluency remarkably improves one’s reading skills.

 

2a. Living Biblical Languages was one of the first real attempts to adapt contemporary models of Second Language Acquisition to Biblical Languages. What were the things that personally prompted you to go down that track?

Probably the biggest influence was comparing Hebrew and Greek, though my African experience also helped and will be described below in the second part of the question. The results of the processes of becoming fluent in Hebrew led to a different perspective outside the traditional patterns for training in a classroom and training for Bible translators.

Twenty years after becoming fluent in Hebrew I could compare what I felt and experienced when reading the Hebrew Bible against what I felt and experienced when reading the Greek New Testament. There was a qualitative difference that had to be acknowledged. There was also a kind of brittleness and unnaturalness that I would perceive in discussions about biblical languages with colleagues or in commentaries. I would muse about ways to overcome that for the coming generation. There was no question that being able to think in a language and having the nuts and bolts automatized was an advantage. Unfortunately, automaticity is/was not achieved through grammar-translation. Trying to talk to myself in Greek was a definite wake-up experience in comparison to Hebrew.

The big challenge was finding a way to develop programs to internalize a language that could be run in a classroom. Fortunately, there are modern language programs that have done this successfully.

 

2b. What role did your work in Africa play in shaping this?

In Africa I was responsible for recommending training programs for occasional translation projects. One of the discoveries was finding out that there were no Christian institutions or seminaries to send students where optimal language learning methods were being taken seriously. African translators were multilingual and good language learners but intuitively they were often puzzled and frustrated by what would take place in “biblical language” classes. My sensitivity to the need of a radical, paradigmatic change in biblical studies was reinforced by watching Bible translators from Africa go off for two or more years of training in biblical language(s) and returning with skills far below what is possible, for example, in programs like Goethe Institute for German and German literature. So both the end product, what is achieved in twenty years, and the introductory phase, what is achieved in two years, fall far short of what is possible with human language learning.

Surely institutions interested in the Bible and Bible translation could do better and develop state of the art language learning. If twenty years of grammar-translation do not produce fluency, then what should be done? What could be done?

 

3. How did you first go about developing communicative methods for teaching biblical languages?

Basically, it meant applying what was most efficient in second language theory. Put simply, this means doing what is known to work well, and avoiding what is known to hinder or work poorly for internalizing a language.

One widely recognized and tested introduction into a language is what is known as “Total Physical Response”, developed by James Asher in the 1960s and 70s. Something similarly effective and able to be reduced to paper was the “Learnables” picture series developed by Harry Winitz. Both of those methodologies could be directly applied to ancient languages. Probably the first time I worked on the issue was running a Biblical Hebrew workshop for translators in a sub-Saharan African country in 1994. Response was enthusiastic but follow-up became an immediate problem. The first time for Greek was in a one-night-per-week Greek class in Jerusalem in 1996–1997. Again, class response was very positive and enthusiastic. During the Greek class we worked out the picture sequences that became Living Biblical Hebrew and Living Koine Greek 1, based on the Winitz system. We also worked out some of the classroom TPR sequences for Greek. The next question is connected. See below.

In more recent years we have adopted interactive storytelling techniques from Blaine Ray’s TPRS. This adds a lot of language production on the part of the student. It has also worked well in our fluency workshops which have focused on increasing teachers’ fluency.

 

4. What were some of the difficulties in developing teaching methods and developing materials?

A big challenge was how to provide live, interactive materials for a self-study audience in a low-tech environment. I decided to build on a system that had grown out of the “army successes” of World War Two and later was adopted by the Foreign Service Institute. Those require good student motivation to be successful, something that frequently accompanies self-study learners. The result was what we call Living Biblical Hebrew Part 2 and Living Koine Greek Part 2. These materials would carry on after our Part 1 introduction to the language through either TPR (in live classes) and/or “pictures” (in the Part 1 book). In Part 2, students were given dialogues to listen to and memorize, many audio drills, some grammar notes, and fully annotated readings from original texts, including insights from sophisticated text-linguistic readings. The materials in Part 2 were written on two levels, a main text that could be followed by students of a high school age and above, as well as footnotes that were intended for more of an academic and linguistic audience.

A bigger challenge, though, is in the classroom. Every year I meet teachers who say something along the lines of “yes, I know that you [i.e. ‘me’-RB] are going down the right path, but I am not capable of running a classroom in the language. How do I jump the gap?” That is a big challenge, how does one jump the gap when the teachers have not been trained to speak or think in the language? Our “fluency workshops” have been a first attempt at helping teachers begin to tool up to bridge the gap.

We have been encouraged in the outcome. Quite a few participants, more than we expected, have gone home from the BLC fluency workshops with the confidence, beginning skill sets, and tools to start teaching “communicatively.” Some of these are included among those you have been interviewing.

Another difficulty emerges as one watches interest in this pedagogy grow. An inherent difficulty is making sure the language you are using and internalizing in the classroom is representative of good Biblical Hebrew and Koine Greek (which, unlike modern languages, are removed from us in culture and time, with no mother-tongue community to consult). This requires a high command of the biblical languages, deep familiarity with the biblical texts, and knowledge of how the languages developed over time. For Greek, this means researching the language in the whole corpus of Koine texts. For Hebrew, where outside texts datable to classical Hebrew are much more limited, it means making decisions about how to “fill in the gaps” of words or phrases which didn’t happen to find their way into the biblical text.

Teachers need to produce material and develop material that they do not yet control. This can be deceptively more difficult than teachers may assume. Fortunately, languages can self-correct in a kind of spiral fashion. As a person learns more and produces more language they encounter structures and vocabulary that they would not have used themselves. This is then integrated into their language use and so it goes. However, the lack of production in most traditional programs does not seem to benefit from this spiraling process of self-correction.

Let me illustrate with an anecdote from a meeting on biblical language pedagogy. I was sitting in an audience with another professor, an advocate of ‘grammar-translation,’ listening to a demonstration of communicative techniques that was frankly lacking in many respects. Afterwards, the professor turned to me and commented “What was that? That misuse of the language is exactly why these professors shouldn’t be using communicative methods to teach the language.” I responded, “Please note that all of those up front have PhD’s and were trained in ‘grammar-translation’! You just got to see and hear what is actually going on in their heads when they try to process the language. Yes, it can be shocking. Without meaning to, they just demonstrated why the grammar-translation method does not produce desired or intended results. What the professors need is better materials and more fluency training.” The professor agreed that that was certainly a wake-up call on the inadequacies of grammar-translation.

Let me give another small example. If a teacher tries to say (ani) eshev אשב for a simple “I am sitting,” they may not realize that they have just created something that is against biblical Hebrew. (This example really happens in the guild of Hebrew teachers, like in the anecdote above.) If they become more sensitive to the language they will run across patterns like ani yoshev אני יושב in the Bible. The point is that writing materials and putting things in print is a big responsibility for the coming generation and we are not doing this lightly at BLC. We have proceeded through several iterations up the language spiral. A tremendous amount of time is spent getting things correct, finding out how things would have been said in Greek in the first century or by Jeremiah in Hebrew if transported to our time period. The point is that communicative language teaching must be committed to the highest standards of language accuracy and cannot be left in the hands of whatever someone might extrapolate from a dictionary or from a theory about what Greek or Hebrew is alleged to be. We end up interacting directly with the ancient writers and users of the ancient language in ways that are quite heuristic.

As an aside to this question I should probably discuss orality and pronunciation. When someone aims at internalizing a language, that requires using the language rapidly and massively. Rapid use of a language means listening and speaking. And listening and speaking require decisions about pronunciation. The pronunciation needs to be something justifiable that a person can live with after internalizing the language. For Hebrew that was an easy call. We use an “oriental” Israeli system that includes a true pharyngeal `ayin and Het, two sounds that were typically ignored in most biblical programs and that are considered desirable for best Hebrew reading in synagogues and official radio.

Greek was more of a problem. From papyri reading I had already known that Greek pronunciation in the first century was closer to modern Greek than the various artificial systems typically called Erasmian, including the “restored Attic” systems. My first assumption was that a modern Greek pronunciation would be best; it would connect to the Greek people and would be parallel to what was best in Hebrew. However, friends convinced me that seminaries and university profs would not consider using the language materials if they were done “modern.” So I asked my friends, “what if the materials were done in a Koine pronunciation?” Friends and colleagues agreed in principle, though it is probably fair to say that most of them were unaware how far their ‘seminary Greek’ pronunciation was out of sync with the Koine materials they were reading.

As a trained linguist who had worked out a phonology for a complex Nilotic language, I was able to distill the phonological system(s) of Koine Greek and produce a workable synthesis that could be justified and that I would be happy to end up with. The Koine system is accepted by modern Greek speakers as being “Greek” and it historically fits the expectations of what travelers and writers in the first century were exposed to. For modern speakers it only means adding a French “u” (German umlaut u, that is, a rounded high front vowel) and adding a clear [e] sound between the ι and ε. Probably a majority of those who are embracing the need to develop internalization and a spoken pedagogy have come to similar conclusions and are adopting a Koine pronunciation. That is the framework that people would have been using to listen to Paul, Mark, Peter, and Barnabas as they traveled and preached. See here [www.biblicallanguagecenter.com] for more details on Koine pronunciation.

 

5. Reflecting on the courses you’ve been involved in teaching, what sorts of outcomes do you see from students who go through BLC’s programmes?

Things depend on motivation, testing methods, and opportunities. We have also had beginning students as well as teachers go through the programs. Unfortunately, the field doesn’t have standardized testing for comparison and measurement.

First of all, if students do the Picture series and/or a first level program of a BLC course, then they are ahead of typical ‘grammar-translation’ students in terms of internalization. Their brains have started to process the language as a language rather than as a math formula. This is most easily seen when such students go on to a modern ulpan program like at Hebrew University. Those with a BLC background do better than a strictly grammar-translation “biblical” background.

Furthermore, if a student diligently uses the audio drills they can achieve quite remarkable levels of vocabulary acquisition. We have been surprised by this on occasion when students will go through the audio materials thoroughly in order to prepare for an intermediate level BLC program. They come into the intermediate level significantly ahead of other students, sometimes even those students who had gone through the first level BLC program. High motivation with multifaceted audio materials can go a long way.

So summarizing, the BLC programs produce a wider vocabulary acquisition and more internalization than “traditional” programs. Knowledge of morphological structure is similar to other programs. Knowledge of discourse structure may be enhanced through the communicative methods. To use metaphors, one may compare ‘grammar-translation’ versus ‘communicative’ to putting a stick in the ground versus growing a plant. The height of the stick or plant after the first year is primarily a reflection of students’ abilities to take tests. But the plant can keep growing into a tree. The stick, on the other hand, is stuck and doesn’t grow into a tree, though it may be replaced by slightly taller sticks at a few intervals. Grammar translation can build the frame of a house, communicative methodologies eventually allow someone to live in the house.

We do not encourage a focus on metalanguage in the beginning level since that actually impedes internalization. Students going into traditional second year programs sometimes need to learn how to spit out rapidly “third masculine singular la-di-da of the something binyan from the blankety root.” That process seems to take about a month to get used to, where such gymnastics are what many teachers and students consider “language learning.” Spitting out that metalanguage means breaking one’s comprehension and communication and stepping outside of the language for a brief moment with every word and every clause. The students from a communicative background have such knowledge available and discussing this rapidly is something fairly easily learned, but it, too, takes some time and practice. In addition, it should be noted that analytical abilities that are prized by academic programs are not something shared equally among language students. Language learning is something that should be available for all, it is part of being human in the image of God, but becoming an analyst is more specialized.

One may say that communicative methodologies “widen the gate” for more students to succeed at beginning levels, and something equally important, communicative methodologies “raise the bar,” they remove the ceiling in language fluency so that students may attain much higher skill levels in the language than are achievable through “grammar-translation.” Twenty years of “grammar-translation” cannot compare to twenty years of “communicative methodology and language use.” Productive fluency is a requirement for high level reading skills. What does this latter mean? It means that one can read directly to the end of a paragraph and know what was said, picturing the whole, without the distraction of having to continually reread every phrase along the way multiple times.

 

6. Lastly, given that there is a growing interest in this area, what are your hopes for communicative methodologies in this field for the next few years, and what do you think is some of the needed work going forward?

In the last decade or so we have seen growing interest among Greek and Hebrew teachers at professional organizations like the Society of Biblical Literature and the Evangelical Theological Society in North America. Both organizations now have sections entitled “Applied Linguistics for Biblical Languages.” These provide a forum for discussing all of the issues that arise around language pedagogy. Pedagogy is not something that can be left in the hands of a grad student with a grammar and an attendance book. It is an academic endeavor in its own right that can reap benefits for the whole field of biblical studies down the road.

I would expect that within a decade there will be positions advertising for faculty with an included criterion “able to speak language X and teach in the language in a classroom.”

When that happens a few times in a few places, the upcoming generation of students will “smell the coffee” and endeavor to become fluent in their language BEFORE their dissertations and hitting the job market.

Until then, it would seem that most students and programs are treating fluency and communicative methodologies as “optional” and either ignoring the methodologies or using them only for widening the gate but not raising the bar. The field vitally needs the new methodologies and the greater fluency that they can produce. The internet is making more texts available to an interpreter and the next generation will need to be more fluent rather than less fluent than previous generations.

I’m not fluent!

It’s a common question, “Are you fluent?” or “How many languages are you fluent in?”

But it’s not a very helpful question, because ‘fluency’ is a difficult concept to define. Many people have the idea that you ‘learn a language’ and, once you have learnt it, you are a native-like speaker with perfect pronunciation and complete mastery. This almost never happens for an L2 speaker.

 

Benny Lewis, of “Fluent in 3 Months” fame, in his book of the same name, treats it as basically B2 on the CEFR and above, and quotes the descriptor “can interact with a degree of fluency and spontaneity that makes regular interaction with native speakers quite possible without strain for either party”. It doesn’t help that ‘fluency’ is part of this criterion! Personally, I think B2 pegs what most people mean by ‘fluency’ a little too low; C1 might match most people’s perceptions a little better.

 

I’m not fluent in Latin or Greek. I don’t even pretend to be. I have far too little experience in speaking and listening, and conversation, to make any claims like this. I can read very competently, I can write moderately well, and I can buy my theoretical groceries (i.e. I have had and can have routine conversations about things I am familiar with). I wasn’t taught Latin or Greek communicatively, so it’s no wonder I’m not very communicative in them. Instead, I studied both in very traditional philological styles, grammar + translation + analysis. So I’m very good at grammar, translation, analysis.

 

However, I think this was a mistake; I think analysis is better achieved by learning communicatively. So in this regard I think of myself as a journeyman, someone who has at least done their apprenticeship and is on the way. One day, perhaps, I will be a master. And I take it that those of us interested in this kind of learning are on a journey together. We’re none of us perfect, but to quote a WAYK aphorism/technique, “We’ll all get there together.” Speaking a language isn’t an individualistic enterprise; it’s about a community of speakers. I’ll help you get further on that journey, and you’ll help me. Even if we’re not in the same place to start with, and don’t end in the same place, we’ll both get a little further along, which means the community will grow a little too.

 

Fluency is not a point. There’s no end goal where one arrives and says, “okay, I have learnt language X”. Acquisition of a language is not a binary Yes/No state of being. A much better approach is to ask:

 

* Did communication occur?

* Was the communicative event successful? (or was it failed communication? or miscommunication?)

* How successful?

* How ‘fluid’ was the communicative act? (Yes, we could use the word ‘fluent’, but it will only distract us; we are asking whether it flowed or whether it was halting, segmented, disrupted)

* How ‘comfortable’ was the communicative act on both sides? (Was it uncomfortable? Did it feel strained?)

* How much accommodating behaviour was necessary by one party or the other? (i.e. adjustment of language in order to facilitate communication, say, when more precise or fitting terms or structures would be more appropriate but would be less successful)

 

When we start to evaluate communicative acts, events, and discourses, we realise that the same speaker may perform and communicate outstandingly well at one time, but dreadfully at another. Are they a ‘fluent’ speaker? Fluency tries to raise the bar and say, “you need to be competent in all situations and communicative events”. I’m saying: junk that, and let’s just work out how to learn from every communicative event that falls short, and work on how to improve not only an individual’s skills, but a language community’s whole system of communication.