BIK Terminology—

Solving the terminology puzzle, one posting at a time

  • Author

    Barbara Inge Karsch - Terminology Consulting and Training


Archive for the ‘Tool’ Category

A glossary for MT–terrific! MT on a glossary—horrific!

Posted by Barbara Inge Karsch on November 3, 2012

In the last few months, I have been reading quite a bit about machine translation. And I also took the opportunity at the recent LocWorld in Seattle and the ATA conference in San Diego to attend sessions on MT.

In Seattle, TAUS presented several real-world examples of what can be done today with the Moses engine. It was refreshing to hear from experts on statistical MT that terminology matters, since that camp, at least at MS, had largely been ignorant of terminology management in the past. There are a number of worthwhile tutorials on the TAUS site for those who'd like to stay abreast of developments.

[Screen print: JDE translation UI]

At the ATA, the usual suspects, Laurie Gerber, Rubén de la Fuente, and Mike Dillinger, outdid each other once again in debunking the myth around MT. When fears did come up in the audience about MT and its effects, I had to think of a little story:

In the mid-90s, five of us German translators at J.D. Edwards were huddled in a conference room for some training. Something or someone was terribly delayed, and while chatting we all started catching up on the translation quota due that day. You know what that involved? It involved finding a string that came up 500 to 800 times. After translating it once, you could continue your chat and hit Enter 500 to 800 times. See the screen print of the translation software to the right and you will realize that the software didn't allow a human to translate like a human; we translated like machines, only worse: oops, the 800 strings are through and you are on to the next source string. Some would call this negligent behavior, and I am glad that today we have better translation software and that machines assist with or do the jobs that we are not good at.

Here is an example of where MT should not have been used. Those of you who know German can check out this text translated by MT; you will recognize very easily that it doesn't make sense to have a glossary translated by a machine, least of all if there is no post-editing.

Posted in Events, Machine translation | 3 Comments »

ATA impressions

Posted by Barbara Inge Karsch on November 22, 2011


I have traveled quite a bit during the last four weeks and it is high time for an update. Let me start with a review of yet another great conference of the American Translators Association in Boston.

At last year's ATA conference in Denver, I was still stunned that the Association seemed to be merely catching up with technology and with the opportunity to embrace machine translation. This year, I saw something completely different. Mike Dillinger gave a well-attended, entertaining and educational seminar on machine translation. He certainly lived up to his promise of showing "what the translator's role is in this new business model."

It became very clear that editing for MT is a market segment on the rise, if not during Mike's seminar, then during Laurie Gerber's presentation on the specifics of editing machine translation output. She also shared tips on how to educate "over-optimistic clients". Add to that Jost Zetzsche's presentation on dealing with the flood of data, and the puzzle pieces start forming a picture of new skills and new jobs.

Jost's presentation is very much in line with an article by Detlef Reineke and Christian Galinski in eDITion, the publication of the German Terminology Association, DTT, about the flood of terminology in our future ("Vor uns die Terminologieflut"). To stem the flood, it helps to think of "data," as Jost did, rather than texts, documents or even segments. He also declared the glossary outdated and announced a bright future for terminology databases. To think about texts, documents, segments, concepts and terms as data is helpful in the sense that data along with solid corresponding metadata have a higher reuse value, if you will, than unmanaged translation memories or the final translation product. That has been terminologists' message for a long time.

I also attended sessions on translation education, one by the University of Illinois at Urbana-Champaign and one by New York University. Since I will be working with the Translation Center of the University of Illinois on a small research project and am currently preparing the online terminology course that will be part of the M.S. at NYU starting this spring, it was nice to meet my colleagues in person.

Posted in Events, Machine translation | 2 Comments »

Terminology extraction with memoQ 5.0 RC

Posted by Barbara Inge Karsch on August 15, 2011

In the framework of a TermNet study, I have been researching and gathering data about terminology management systems (TMS). We will not focus on term extraction (TE) tools, but since one of our tool candidates recently released a new term extraction module, I wanted to check it out. Here is what I learned from giving the TE functionality of the memoQ 5.0 release candidate a good run.

Let me start by saying that this test made me realize again how much I enjoy working with terminological data; I love analyzing terms and concepts, researching meaning and compiling data in entries; to me it is a very creative process. Note furthermore that I am not an expert in term extraction tools: I was a serious power user of several proprietary term extraction tools at JDE and Microsoft; I haven't worked with the Trados solution since 2003; and I have only played with a few other methods (e.g. Word/Excel and SynchroTerm). So, my view of the market at the moment is by no means a comprehensive one. It is, however, that of a user who has done some serious term mining work. One of the biggest projects I ever did was the Axapta 4.0 specs. It took us several days just to load all the documents onto a server directory; it took the engine at least a night to "spit out" 14,000 term candidates; and it took me an exhausting week to nail down 500 designators worth working with.

As a mere user, as opposed to a computational linguist, I am not primarily interested in the performance of the extraction engine (I actually think the topic is a bit overrated); I like that in memoQ I can set the minimum/maximum word lengths, the minimum frequency, and the inclusion/exclusion of words with numbers (the home-grown solutions had predefined settings for all of this). But beyond the rough selection, I can deal with either too many or too few suggestions, provided the tool allows me to quickly add or delete what I deem the appropriate form. There will always be noise, and lots of it. I would rather have the developer focus on the usability of the interface than "waste" time on tweaking algorithms a tiny bit more.

So, along the lines of the previous posting on UX design, my requirements of a TE tool are that it allow me to

  • Process term candidates (go/no-go decision) extremely fast and
  • Move data into the TMS smoothly and flawlessly.
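To make the first requirement concrete, here is a little sketch (in Python, purely illustrative and much cruder than memoQ's actual engine) of how the user-controlled settings mentioned above, minimum/maximum word length, minimum frequency, number filtering and a stop word list, can drive a naive candidate extraction:

```python
import re
from collections import Counter

def extract_candidates(text, min_words=1, max_words=3,
                       min_freq=2, allow_numbers=False,
                       stop_words=frozenset()):
    """Naive n-gram term candidate extraction with user-facing settings."""
    tokens = re.findall(r"[a-z0-9'-]+", text.lower())
    counts = Counter()
    for n in range(min_words, max_words + 1):
        for i in range(len(tokens) - n + 1):
            gram = tokens[i:i + n]
            if not allow_numbers and any(re.search(r"\d", t) for t in gram):
                continue  # exclude candidates containing numbers
            if gram[0] in stop_words or gram[-1] in stop_words:
                continue  # candidates should not start or end with a stop word
            counts[" ".join(gram)] += 1
    return {term: freq for term, freq in counts.items() if freq >= min_freq}

text = "The term base stores terms. A term base is not a glossary."
print(extract_candidates(text, stop_words={"the", "a", "is", "not"}))
```

Real extractors add stemming, smarter tokenization and statistical ranking; the point here is only that the user-facing knobs are conceptually simple, which is why I would rather see development effort go into the review interface.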

memoQ by Kilgray Translation Technologies* meets the first requirement very nicely. My (monolingual) test project consisted of the PowerPoint presentations of the ECQA Certified Terminology Manager, which I had gone through in detail the previous week and which contained 28,979 English words. Because the subject matter is utterly familiar to me, there was no question as to what should make the cut and what shouldn't. I loved that I could "race" through the list and go yay or nay; that I could merge obvious synonyms; and that I could modify term candidates to reflect their canonical form. Because the contexts for each candidate are all visible, I could even have checked the meaning in context quickly if I had needed to.

I also appreciated that there is already a stop word list in place. It was very easy to add to it, though here is one suggestion: it would be great to have the term candidate automatically inserted in the stop-word dialog. Right now, I still have to type it in; it would save time if it were prefilled. Since the stop word list is not very extensive (e.g. even words like "doesn't" are missing from the English list), it will take everyone considerable time to build up a list whose core will not vary substantially from user to user. But that may be too much to ask of a first release.

As for my second requirement, memoQ term extraction doesn't meet it (yet); note that I only tested the transfer of data to memoQ, not to qTerm. I know it is asking a lot to have a workflow from cleaned-up term candidate list to terminological entry in a TMS. Here are two suggestions that would make a difference to users:

  • Provide a way to move context from the source document, incl. context source, into the new terminological entry.
  • Merging terms into one entry because they are synonyms is great. But they need to show up as synonyms when imported into the term base; none of my short forms (e.g. POS, TMS) showed up in the entry for the long forms (e.g. part of speech, terminology management systems) when I moved them into the memoQ term base.
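What I am asking for in the second bullet could look something like this sketch (illustrative Python, not Kilgray's actual import logic), where each group of merged candidates becomes a single termbase entry with its synonyms intact:

```python
def import_merged_candidates(groups):
    """Import each group of merged term candidates as ONE termbase
    entry whose terms are synonyms, with the long form listed first."""
    termbase = []
    for group in groups:
        termbase.append({
            "id": len(termbase) + 1,
            "terms": sorted(group, key=len, reverse=True),  # long form first
        })
    return termbase

entries = import_merged_candidates([
    {"part of speech", "POS"},
    {"terminology management system", "TMS"},
])
for entry in entries:
    print(entry["id"], " = ".join(entry["terms"]))
```

The key design point is that the merge decision made during candidate review survives the import: POS lands inside the entry for part of speech instead of becoming a separate, orphaned entry.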

My main overall wish is that TE be integrated with authoring and translation in a way that allows companies and LSPs, writers and translators to have an efficient workflow. It is imperative in technical communication and translation to document terms and concepts. When this task is put on the translators, it happens quite late, but that is better than it not happening at all. Only fast and flawless processing will allow one-person enterprises, or multi-person enterprises for that matter, to carry out terminology work as part of the content supply chain. When the "fast and flawless" prerequisite is met, even those of my translator friends who detest the term "content supply chain" will have enough time to enjoy the more creative aspects of their profession. Then the economic requirements essential on the macro level are met, and the individual's need to get satisfaction out of the task is fulfilled on the micro level. The TE functionality of memoQ 5.0 RC excels in design and, in my opinion, is ready for translators' use. If you have any comments, whether you agree or disagree with me, I'd love to hear them.

*Kilgray is a client of BIK Terminology.

Posted in Designing a terminology database, memoQ, Producing quantity, Selecting terms, Term extraction tool, Usability | 3 Comments »

Doublettes—such a pretty term, yet such a bad concept

Posted by Barbara Inge Karsch on June 10, 2011

Sooner rather than later, terminologists need to think about database maintenance. Initially, with few entries in the database, data integrity is easy to warrant: the terminologist might remember just about any entry they ever compiled; my Italian colleague, Licia, remembered just about any entry she ever opened in the database. But even the best human brains will eventually 'run out of memory' and blunders will happen. One of these blunders is the so-called doublette.

According to ISO TR 26162, a doublette is a “terminological entry that describes the same concept as another entry.” Sometimes these entries are also referred to as duplicates or duplicate entries, but the technical term in standards is doublette. It is important to note that homonyms do not equal doublettes. In other words, two terms that are spelt the same way and that are in two separate entries may refer to the same concept and may therefore be doublettes. But they may also justifiably be listed in separate entries, because they denote slightly or completely different concepts.

As an example, I deliberately set up doublettes in i-Term, a terminology management system developed by DANTERM: The terms automated teller machine and electronic cash machine can be considered synonyms and should be listed in one terminological entry. Below you can see that automated teller machine and its abbreviated form ATM have one definition and definition source, while electronic cash machine and its abbreviated form, cash machine, are listed in a separate entry with another, yet similar definition and its definition source. During database maintenance, these entries should be consolidated into one terminological entry with all its synonyms.


Homographs that turn out to be doublettes are much easier to detect. Better yet, they should be easier to avoid in the first place: after all, every new entry in a database starts with a search for the term denoting the concept; if the term already exists with the same spelling, the search would produce a hit. Here are 'homograph doublettes' from the Microsoft Language Portal. While we can't see the entry IDs, the definitions show pretty clearly that the two entries describe the same concept.


Doublettes happen, particularly in settings where more than one terminologist adds and approves entries in a database. But even if one terminologist approves all new concepts, s/he cannot guarantee that a database remains free of doublettes. The right combination of skills, processes and tool support can help limit the number, though.
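Tool support for the homograph case can be as simple as this sketch (illustrative Python, not a feature of any particular TMS): group entries by normalized term and flag every term that appears in more than one entry for human review.

```python
from collections import defaultdict

def doublette_candidates(entries):
    """Group entry IDs by normalized term; any term that appears in
    more than one entry flags those entries for human review."""
    by_term = defaultdict(set)
    for entry in entries:
        for term in entry["terms"]:
            by_term[term.lower()].add(entry["id"])
    return {term: ids for term, ids in by_term.items() if len(ids) > 1}

entries = [
    {"id": 1, "terms": ["automated teller machine", "ATM"]},
    {"id": 2, "terms": ["electronic cash machine", "cash machine"]},
    {"id": 3, "terms": ["ATM"]},  # homograph of entry 1's abbreviation
]
print(doublette_candidates(entries))
```

Note that entries 1 and 2 are the ATM doublettes from the i-Term example above; they share no spelling, so homograph matching alone will never catch them. That is exactly why the human review step, and the skills and processes behind it, stay essential.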

Posted in iTerm, Maintaining a database, Microsoft Language Portal, Process, Setting up entries | 4 Comments »

Greetings from Budapest!

Posted by Barbara Inge Karsch on April 13, 2011

Here is a virtual postcard from memoQfest in Hungary's capital.

[Photo: Sun setting over Hero's Square by BIK]

After two days of train-the-trainer courses and master classes on the new Kilgray terminology tool and on memoQ, energy is running as high as ever at memoQfest. For those of you who are not familiar with memoQ, check out the Kilgray website for background information. Though no longer a translator, I tested the tool for the first time last fall and liked it right away. The structure and interface of the tool are so simple that I was working away on a translation within two hours.

As a terminologist, I struggled with the term base part of memoQ: it seemed too simplistic and simply not extensive enough. That said, if it is paired with a solid terminology management tool in the background (see qTerm), the current functionality passes muster as a quick way for translators to add terms that hadn't been documented beforehand. And which project ever has enough terms documented upfront (see How many terms do we need to document?)?

Besides the fact that I very much like memoQ and will continue to watch and, where possible, assist with the development of qTerm, I simply enjoy the Kilgray culture. We closed out the night at a wine tasting, but spirits would have been high even without a bit of alcohol.

Posted in Events, Tool, Translator | 2 Comments »

A new tool, a new app, a new what?

Posted by Barbara Inge Karsch on January 21, 2011

Enterprise terminologists generally don’t have the easiest job—nobody understands what they are doing, most people don’t know that they exist, and some even refuse to cooperate. A widget may be just what they need.

A widget, really? While we can argue about the (code) name of the new SDL MultiTerm Widget, the concept behind it is a good one: it is a small application that anyone in a company can use to look up the meaning of a term. They just need to highlight the term, and the application displays a hit list, either from the company terminology database (MultiTerm, of course), from a search engine, or from any website the user indicated in the app beforehand. A few different user scenarios for the Widget come to mind.

[Screenshot: Widget results, courtesy of SDL MultiTerm]
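In principle, the lookup behind such a widget is straightforward. This sketch (illustrative Python, not SDL's actual implementation; the source names and definitions are made up) queries each configured source in turn and aggregates the hits:

```python
def widget_lookup(term, sources):
    """Query each configured source in order and aggregate the hits.
    'sources' maps a display label to a lookup callable."""
    hits = []
    for label, lookup in sources.items():
        for result in lookup(term):
            hits.append((label, result))
    return hits

# A hypothetical company termbase plus a fallback web source:
termbase = {"doublette": "Terminological entry that describes the same "
                         "concept as another entry."}
sources = {
    "Company termbase": lambda t: [termbase[t]] if t in termbase else [],
    "Web search": lambda t: [f"(web results for '{t}' would appear here)"],
}
for label, hit in widget_lookup("doublette", sources):
    print(f"{label}: {hit}")
```

Listing the company termbase first is deliberate: the managed terminology should always be the first hit the employee sees.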

If I were still a corporate terminologist, I would put on a major campaign to introduce the Widget to every communication professional through a video, a brown-bag meeting, or simply an e-mail. The main focus would be on how easy it is for lawyers, trainers, marketing and branding experts, etc. to use corporate terminology consistently. As non-terminology experts, these professionals cannot be bothered to use a terminology expert's tool. They need information, and they need it fast.

Much to my chagrin, a link to LEO, a German-English online dictionary, was embedded in the German Microsoft intranet site. Now, there is nothing wrong with an online dictionary, but it was hard to turn people's attention from this simple link to the corporate database. Since most terminology teams don't have huge funds for tools development, the Widget could be that simple solution to steer employees away from unmanaged and toward managed corporate terminology. If you put correct and standardized terms at their fingertips, they'll use them.

Another scenario that came to mind when I saw the Widget the other day is visitors from subsidiaries. At J.D. Edwards, German consultants would come to the Denver headquarters fairly often to attend training sessions on the newer technologies. Their English was quite good, but they were not always familiar with every new term. They would ask us for glossaries to assist them during the training. If they had had such a tool while they were working on a project in class, they could have looked up critical terms in the database.

Eventually, you would want the app to allow users to share terms that are not yet part of the database. We had an integrated terminology workflow with suggestion functionality at J.D. Edwards (see Perspectives on Localization) and later at Microsoft. Small terminologist teams at large companies need to stem a flood of unmanaged terms, and the closer they are to expert information, the better.

If the Widget doesn't take off, it's time for Michael W. to go join Kilgray and work on qTerm. But if SDL is smart, they will price it for the masses and give enterprise terminology management a major boost.

Posted in Branding, Content publisher, Subject matter expert, Tool | Leave a Comment »

Gerunds, oh how we love them

Posted by Barbara Inge Karsch on December 9, 2010

Well, actually we do. They are an important part of the English language. But more often than not, they get used incorrectly in writing and, what's worse, documented incorrectly in terminology entries.

I have been asked at least a few times by content publishers whether they can use gerunds or whether a gerund would present a problem for translators. It doesn’t present a problem for translators, since translators do not work word for word or term for term (see this earlier posting). They must understand the meaning of the semantic unit in the source text and then render the same meaning in the target language, no matter the part of speech they choose.

It is a different issue with machine translation. There is quite a bit of research in this area of natural language processing. Gerunds, for example, don't exist in the German language (see Interaction between syntax and semantics: The case of gerund translation). But more importantly, gerunds can express multiple meanings and function as verbs or nouns (see this article by Rafael Guzmán). Therefore, human translators have to make choices. They are capable of that. Machines are not. If you are writing for machine translation and your style guide tells you to avoid gerunds, you should comply.

Because gerunds express multiple meanings, they are also interesting for those of us with a terminologist function. I believe they are the single biggest source of mistakes I have seen in my 14 years as a corporate terminologist. Here are a few examples.

Example 1:

Example 2:



In Example 1, it is clear that logging refers to a process. The first instance could be part of the name of a functionality, which, as the first instance in Example 2 shows, can be activated. In the second instance ("unlike logging"), it is not quite clear what is meant. I have seen logging used as a synonym of the noun log, i.e. the result of logging. But here, it probably refers to the process or the functionality.

It matters what the term refers to; it matters to the consumer of the text, the translator, who is really the most critical reader, and it matters when the concepts are entered in the terminology database. It would probably be clearest if the following terms were documented:

  • logging = The process of recording actions that take place on a computer, network, or system. (Microsoft Language Portal)
  • logging; log = A record of transactions or events that take place within an IT managed environment. (Microsoft Language Portal)
  • Process Monitoring logging = The functionality that allows users to …(BIK based on context)
  • log = To record transactions or events that take place on a computer, network or system. (BIK based on Microsoft Language Portal).

Another example of an -ing form that has caused confusion in the past is the term backflushing. A colleague insisted that it be documented as a verb. To backflush, the backflushing method or a backflush are curious terms, no doubt. But we still must list them in canonical form and with the appropriate definition. Why? Well, for one thing, anything less than precise causes more harm than good even in a monolingual environment. But what is a translator or target terminologist to do with an entry where the term indicates that it is an adjective, the definition starts with "A method that…", and the Part of Speech says Verb? Hopefully, they complain; but if they don't and simply make a decision, it will lead to errors. Human translators might just be confused, but the MT engine won't recognize the mistake.
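A tool could even catch some of these inconsistencies automatically. Here is a rough heuristic (illustrative Python; the backflushing definition text is made up for the example) that flags entries whose definition style disagrees with the documented part of speech:

```python
def pos_definition_mismatch(entry):
    """Heuristic: definitions phrased 'A/An/The ...' read as nouns,
    'To ...' as verbs; flag entries whose stated part of speech
    disagrees with the definition style."""
    definition, pos = entry["definition"], entry["pos"]
    looks_like_noun = definition.startswith(("A ", "An ", "The "))
    looks_like_verb = definition.startswith("To ")
    if looks_like_noun and pos != "noun":
        return True
    if looks_like_verb and pos != "verb":
        return True
    return False

entry = {"term": "backflushing", "pos": "verb",
         "definition": "A method that deducts components from inventory "
                       "after production."}  # illustrative wording
print(pos_definition_mismatch(entry))
```

A check like this will not decide whether the verb or the noun entry is the right one, but it surfaces exactly the kind of entry described above before a translator ever sees it.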

So, the answer to the question "Can I use gerunds?" is: yes, you can. But be sure you know exactly what the gerund stands for, the process or the result. If it is used as a verb, document it in its canonical form. Otherwise, there is trouble.

Posted in Content publisher, Interesting terms, Machine translation, Setting up entries, Translator | 4 Comments »

Machine translation and excellence

Posted by Barbara Inge Karsch on November 2, 2010

What does machine translation do in a blog on terminology management? And how on earth can MT and excellence appear in the same sentence? Terminology management is one of the cornerstones of MT. And excellence isn’t tied to a technology, it is tied to you!

After a seven-year hiatus, I finally joined my friends in the American Translators Association again this year. And last week, I was in my former US hometown of Denver at the annual conference of the ATA. I was surprised by a couple of things.

For one, the profession has matured incredibly since I last attended the conference in Phoenix in 2003. The number of government representatives who attended and spoke coherently about the field bears witness to that. There was great news coverage. But most of all, I could sense a different attitude among attendees, many of whom I have known since I first joined the ATA in 1996: there is pride in what we do and the courage to stand up for it!

The other surprise was that this apparently was the first time a real interaction between representatives of human AND machine translation took place (“Man vs. Machine,” a panel discussion of representatives from both camps moderated by Jost Zetzsche). That is stunning to me, since I spent the last six years in an environment where MT is routine.

As a terminologist, I look at MT as an opportunity for cooperation. In fact, it was Microsoft Research, where machine translation research is located within Microsoft, who declined when we first suggested collaboration years ago. It made sense to us to supply well-researched terms and appellations to support MT. Terminology from Term Studio has since been integrated into the MT process at Microsoft.

I suppose it is fear that still seems to have a hold on translators. It might be the fear of losing market share, of needing to change to more tools or automation, or of failing with clients. Let’s go through this one by one.

In my mind, there is one market for translation; we could even say one market for content production in target and/or source languages. This market consists of different segments, and we, as language professionals, have a choice on where we want to play. The segments are not strictly delimited, meaning a translator could move between them, but let’s focus on the following three.

    • A Chris Durban, who represented human translators in the panel discussion and who is serving the high-end translation market (marketing, financial reports, etc.), chooses to stay away from automation. I venture to say that her work is better carried out without automation. The key is that she achieves excellence in what she does. And she asks to be paid for it, and paid well.
    • Another translator might choose to focus on the high-volume market of manuals and handbooks in a particular industry. He will work with what Jost Zetzsche calls translation environment tools, short TEnTs. That will enable him to produce higher volumes than Chris, but with equal excellence.
    • And then there are those, such as the translators at PAHO, the Pan American Health Organization, who post-edit machine translation output. Again, they have a different environment, but they do what they do successfully, because they strive to do it well.

At times, an individual might choose to stay in the segment they are in or to make a transition into another segment, which requires flexibility and diligence. If you thought you could get away with less than hard work, you might have chosen the wrong profession (or planet, for that matter). I believe in that, but I also believe in the fun and gratification that comes from delivering excellence.

The last point is working with clients. The need for client education is high. The ATA has contributed a lot in that area. If you are a member, just check out all the different resources available to us. True, it is tough to do client education when you are making a living being paid only for the word in the translation. It takes skill to find the right balance. Nonetheless, clients must be informed about what they ask for, especially those who say that “quality doesn’t matter,” because very likely they have no clue. Once you have done your duty and the client still insists on some form of “quick and dirty,” you can always say no to the job. I saw projects not succeed despite warnings and suggestions. But it is not the end of the world when someone insists on, say, machine translation without preparatory or complementary work and then fails with their own customers. You could consider it self-education. You just don’t want to be in the middle of it.

In my experience, if we aim for excellence, we will be financially successful and professionally gratified. Then, it doesn’t matter so much whether we chose “pure human translation”, decide on some form of translation automation environment, or focus completely on terminology management.

Posted in Events, Machine translation, Translator | 1 Comment »

What do we do with terms?

Posted by Barbara Inge Karsch on September 23, 2010

We collect or extract terms. We research their underlying concepts. We document terms, and approve or fail them. We might research their target-language equivalents. We distribute them and their terminological entries. We use them. Whatever you do with terms, don't translate them.

A few years ago, Maria Theresa Cabré rightly criticized Microsoft Terminology Studio when a colleague showed it at a conference, because the UI tab for target language entries said “Term Translations.” And if you talk to Klaus-Dirk Schmitz about translating terminology, you will for sure be set straight. I am absolutely with my respected colleagues.

If we translate terms, why don't we pay USD .15 per term, the way we pay per word for translation work? At TKE in Dublin, Kara Warburton quoted a study conducted by Guy Champagne Inc. for the Canadian government in 2004. It found that between 4 and 6% of the words in a text need to be researched, and that, on average, it takes about 20 minutes to research a term. That is why we can't pay USD .15 per term.

Note also that we pay USD .15 per word and not per term. Terms are the signs that express the most complex ideas (concepts) in our technical documents. They carry a lot more meaning than the lexical units called words that connect them.
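A quick back-of-the-envelope calculation based on the Champagne figures shows why the word rate cannot work for terms (the document size is an assumption for illustration):

```python
words = 10_000            # illustrative document size
term_share = 0.05         # 4-6% of words need research per the study; use 5%
research_min = 20         # average minutes of research per term

terms = int(words * term_share)              # terms to research
research_hours = terms * research_min / 60   # total research effort
per_term_at_word_rate = 0.15                 # USD, the per-word rate

print(terms, "terms,", round(research_hours), "hours of research,")
print("paid at the word rate:", round(terms * per_term_at_word_rate, 2), "USD")
```

Paying the word rate for those 500 terms would come to USD 75 for roughly 167 hours of research, which is less than 50 cents per hour. The per-word and per-term prices simply measure different things.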

Let's assume we are a buyer of translation and terminology services. Here is what we can expect:

                                             Translation work        Terminology work
  Number of units a person can
  generally process per day                  ca. 2,000 words         ca. 20 to 50 entries
  Cost for the company                       ca. USD .25 per word    ca. USD 55 per hour

At the end of the translation process, we have a translated text which in this form can only be used once. Of course, it might become part of a translation memory (TM) and be reused. But reuse can only happen if the second product using the TM serves the same readership; if the purpose of the text is the same; if someone analyzes the new source text against the correct TM, etc. And even then, it would be a good idea to proofread the outcome thoroughly.

The terminological entry, on the other hand, should be set up to serve the present purpose (e.g. support a translator during the translation of a particular project). But it might also be set up to allow a support engineer in a branch office to look up the definition of the target equivalent. Or it might enable a technical writer in another product unit to check on the correct and standardized spelling of the source term.

I am not sure that this distinction is clear to all translators who sell terminology services. You might get away with translating terms a few times. But eventually your client’s customers will indicate that there is something wrong, that the product is hard to understand or operate because it is not in their vernacular.

There are far more scientific reasons why we should not confuse translation and terminology work; while related and often (but not always) coinciding, these tasks have different objectives. More about that some other time. Today, let me appeal to those of you whose job it is to support clear and precise communication: reserve the verb "to translate" for the transfer of "textual substance in one language to create textual substance in another language," as Juan Sager puts it in the Routledge Encyclopedia of Translation Studies. If we can be precise in talking about our own field, we should do so.

Posted in Events, Interesting terms, Microsoft Terminology Studio, Researching terms, Setting up entries, Terminology of terminology | 4 Comments »

Quantity AND Quality

Posted by Barbara Inge Karsch on September 16, 2010

In If quantity matters, what about quality? I promised to shed some light on how to achieve quantity without skimping on quality. In knowledge management, it boils down to solid processes supported by reliable and appropriate tools and executed by skilled people. Let me drill down on some aspects of setting up processes and tools to support quantity and quality.

If you cannot afford to build up an encyclopedia for your company (and who can?), select metadata carefully. The number and types of data categories (DCs), as discussed in The Year of Standards, can make a big difference. That is not to say use fewer; use the right ones for your environment.

Along those lines, hide data categories or values where they don't make sense. For example, don't display Grammatical Gender when Language=English; invariably a terminologist will accidentally select a gender, and if even a few users wonder why that is, or notice the error but can't find a way to alert you to it, too much time is wasted. Similarly, hide Grammatical Number when Part of Speech=Verb, and so on.
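In tool terms, such display logic is just a handful of rules. Here is a minimal sketch (illustrative Python, with data category names taken from the examples above; no real TMS configures itself this way):

```python
def visible_data_categories(fields):
    """Return the data categories to display, hiding those that make
    no sense given the values already entered."""
    hidden = set()
    if fields.get("Language") == "English":
        hidden.add("Grammatical Gender")   # English nouns carry no gender
    if fields.get("Part of Speech") == "Verb":
        hidden.add("Grammatical Number")   # grammatical number applies to nouns
    all_dcs = ["Term", "Language", "Part of Speech",
               "Grammatical Gender", "Grammatical Number"]
    return [dc for dc in all_dcs if dc not in hidden]

print(visible_data_categories({"Language": "English", "Part of Speech": "Verb"}))
```

The payoff is that the terminologist never sees a field they could fill in by mistake, and no user time is spent puzzling over an impossible value.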

Plan dependent data, such as product and version, carefully. For example, if versions for all your products are numbered the same way (e.g. 1, 2, 3, …), it might be easiest to have two related tables. If most of your versions have very different version names, you could have one table that lists product and version together (e.g. Windows 95, Windows 2000, Windows XP, …); that makes information retrieval slightly simpler, especially for non-expert users. Or maybe you cannot afford, or don't need, to manage down to the version level because you are in a highly dynamic environment.

[Photo: Anton by Lee Dennis]

Enforce mandatory data when a terminologist releases (approves or fails) an entry. If you decided that five out of your ten DCs are mandatory, let the tool help terminologists by not letting them get away with a shortcut or an oversight.
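The release gate can be sketched just as simply (illustrative Python; the choice of mandatory data categories is an assumption for the example):

```python
MANDATORY = ["Term", "Language", "Part of Speech", "Definition", "Source"]

def release(entry, status):
    """Approve or fail an entry only if every mandatory data category
    is filled in; otherwise the tool blocks the shortcut."""
    missing = [dc for dc in MANDATORY if not entry.get(dc)]
    if missing:
        raise ValueError(f"cannot release entry, missing: {', '.join(missing)}")
    entry["Status"] = status
    return entry

incomplete = {"Term": "doublette", "Language": "English"}
try:
    release(incomplete, "approved")
except ValueError as err:
    print(err)  # the terminologist is told exactly what is still missing
```

Rejecting the release with the list of missing DCs is the important part: the terminologist learns immediately what an oversight would have cost, instead of a reviewer finding it months later.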

It is obviously not an easy task to anticipate what you need in your environment. But well-designed tools and processes support high quality AND quantity and therefore boost your return on investment.

On a personal note, Anton is exhausted with anticipation of our big upcoming event: He will be the ring bearer in our wedding this weekend.

Posted in Advanced terminology topics, Designing a terminology database, Producing quality, Producing quantity, Return on investment, Setting up entries, Terminologist, Tool | 1 Comment »

