North Carolina Libraries, Vol. 48, no. 3







North Carolina Libraries

TABLE OF CONTENTS

THEME ARTICLES: PERFORMANCE MEASURES FOR LIBRARIES
160 Foreword, Jinnie Y. Davis and David M. Paynter

162 But What Does the Data Mean? Getting From What
Happened to Why It Happened, Sharon L. Baker

167 Federal-State Cooperation for Public Library Data
Kitty Smith
173 Use of Staff Output Measures in the Wake County
Public Library System, Val Lovett
179 Public Library Evaluation: A Case Study, James J. Govern

184 Quantity is Not Necessarily Quality: A Challenge
to Librarians to Develop Meaningful Standards of

Performance for Library Reference Services,
Patsy J. Hansel

189 Performance Measures in Youth Services,
Rebecca Sue Taylor

193 Performance Measures and Technical Services:
Efficiency and Effectiveness, Karen S. Croneis
and Linda H. Y. Wang

197 Performance Measures for Online Systems,
John Ulmschneider and Patrick Mullin
205 Theory into Practice, Patricia M. Kelley

209 The Evaluation of Service Activities in Academic
Libraries and Criteria for Evaluation Selected by
Administrators of Those Libraries, Sally Ann Strickler

215 Selective Bibliography on Library Performance Measures,
Cynthia R. Levine



ISSN 0029-2540

FEATURES
158 From the President
159 Over to You

218 POINT: Performance Measures: The Pursuit of
Excellence and Accountability, Jerry A. Thrasher

219 COUNTERPOINT: Performance Measures Can't Quantify
Quality, Harry Tuchmayer

220 Library Research in North Carolina

223 North Carolina Books

229 NCLA Minutes

232 ALA Councilor's Report

234 About the Authors

Cover: Valerie Lovett, "Use of Staff Output Measures in the Wake County Public Library System," North Carolina Libraries 48 (Fall 1990): 173.

Advertisers: Algonquin Books, 222; Blair Publishing, 178; Book Wholesalers, 188; Broadfoot's, 196; Capital Consortium, 228; Checkpoint, 212; Ebsco, 161; Encyclopedia Britannica, 183; Faxon, 172; H. W. Wilson, 166; Mumford Books, 159; Quality Books, 192; Salem Press, 187; SIRS, inside front cover; Southeastern Book Company, 208; Southeastern Microfilm, 165; UNC Press, inside back cover.

Volume 48, Number 3 Fall 1990






From the President

Productivity, Literacy, and Democracy. We are dealing with some heavy issues in libraries this fall! These are issues that matter in the libraries we are designing to meet the needs of the nineties and beyond.

I hope each of you has given these issues
some serious thought as you have attended, or made plans to attend, your regional Governor's
Conference on Libraries. It is not often we get the
opportunity to share what we think about librar-
ies and their future with such a large audience.
The Division of State Library has done us a great
service by scheduling regional conferences within
easy driving distance of every citizen in North
Carolina. It enables us to attend and to encourage
trustees, friends, and government officials to con-
tribute to the issues and resolutions that will be
taken forward to the State Governor's Conference
and then to the White House Conference on
Libraries and Information Science.

The conferences in Charlotte (where Governor Martin was the speaker), Durham, Deep Run, and
Winston-Salem were enlightening. Each confer-
ence included representatives from libraries, the
general public, government officials, and trustees.
Yet to come are the conferences in Supply, Frank-
lin, Lenoir, Elizabeth City, and Rockingham. If you
haven't yet participated in this important activity,
PLEASE DO. It is not only important for libraries,
it is important for North Carolina. Call your local
public library if you need more information.

Many of the sections and roundtables have
had meetings and/or workshops during the late
summer and early fall. I don't dare to mention any
of them specifically for fear of leaving one of them
out. Let me tell you, though, that our association
continues to amaze me with our ability to offer
such valuable continuing education opportunities.
The sections and roundtables seem to have an
endless supply of innovative and creative ideas
for offering workshops that are very worthwhile.
Again, I am proud to say I am part of the North
Carolina Library Association.

Have you made your reservation for the NCLA
bus to Nashville? You should have received a flyer
from Jerry Thrasher (Cumberland County Public

Library, 300 Maiden Lane, Fayetteville, NC 28301)
about the December trip to the Southeastern
Library Association Conference. If not, you should
contact him right away to be sure you will be
included. The cost is only $99.00 round trip. I

hope we will have a large number of North Caro-
lina participants in the SELA conference. I look
forward to seeing you there!

Barbara Baker, President

Over to You

Letter to the Editor

Dear Editor,

My attention was recently drawn to a letter
from Tom Broadfoot to the editor of North Caro-
lina Libraries, published in the summer 1990
issue. In that letter, Mr. Broadfoot referred to a
conversation he had with me in early May 1990,
about leather dressings. I have several comments
in regard to the letter.

First, I wish that Mr. Broadfoot had informed me that he was publishing an article on this subject. I appreciate that he accurately reflected my advice to him on leather dressing (i.e., don't use Vaseline on books). However, it would be common courtesy to let me know about the article if he were talking with me as a part of his "research" into the issues. Furthermore, he should have verified my name before referring to me in print.

Second, if he has taken the trouble to seek advice on leather dressings from an expert and then to confirm suggestions by contacting other experts, I wonder why he does not bother to pay
attention to that advice. There is an extensive
body of knowledge on preserving skin-based arti-
facts; however, not all artifacts can or should be
treated the same way. Vaseline for shoes may be
acceptable (although a cobbler may disagree), but
then no one expects their shoes to last for
hundreds of years.

Third, I am honored that Mr. Broadfoot has
associated me (a preservation administrator) and
my advice with Jan Paris, a respected conservator
with a professional reputation that speaks elo-
quently of her quality, character, and ethics.

Sincerely,

Sandra Nyberg
Preservation Program
SOLINET


MUMFORD

RELIABLE WHOLESALER
SINCE 1977

• Over 90,000 Books in Stock
• Over 10,000 Titles
• Pre-School Through Adult
• Cataloging/Processing Available
• 13 Years of Service
• Discounts up to 70% Off
• "Hands On" Selection
• 100% Fill
• Sturdy Library Bindings
• Now Two Adjacent Warehouses

"Nothing like seeing for yourself."

MUMFORD LIBRARY BOOKS, SOUTHEAST, INC.
7847 Bayberry Road • Jacksonville, Florida 32256

(904) 737-2649    1-800-367-3927

North Carolina Representative: Phil May








Foreword

Jinnie Y. Davis and David M. Paynter, Guest Editors

The concept of measuring a library's performance by objectively quantifying its outputs (its services and programs) was introduced to the
library profession at least two decades ago. The
overflow audiences at two sessions on perfor-
mance measurement at the American Library
Association's annual conference in Chicago attest to the continuing interest of librarians in the use of output, rather than the traditional input,
measures of how well our libraries are performing.
With the newly published manual Measuring Aca-
demic Library Performance to supplement the
1987 Output Measures for Public Libraries,
librarians now have at least two basic tools to
draw upon in carrying out performance measure-
ment in a relatively easy, inexpensive, and pre-
tested manner.

The timing of this issue of North Carolina
Libraries is intended to keep the idea of perfor-
mance measurement alive in the minds of North

Carolina librarians by exploring various aspects and applications in several types of libraries. First,
Sharon Baker differentiates between macroeval-
uative and microevaluative measures and challenges us to go beyond the former (the collection of quantitative data to explain how well a library operates) to incorporate microscale studies that
will help us answer the questions of how and why
the library operates in that way. Baker is also the
co-author, with F. W. Lancaster, of the second
edition of another seminal work on library eval-
uation, Measurement and Evaluation of Library
Services.

Measurement implies the need for quantita-
tive data, and Kitty Smith explores the need for
reliability in data collection by public libraries. In
particular, she explains the role of the nationally
coordinated Federal-State Cooperative System for
Public Library Data in ensuring that comparative
data on public libraries will be available to help
future decision makers.

Jinnie Y. Davis, Library Research in North Carolina editor of
North Carolina Libraries, is Assistant Director for Planning &
Research at the North Carolina State University Libraries,
and David M. Paynter is Director of the New Hanover County
Public Library in Wilmington.


Public libraries in general have had a longer
history of performance measurement than other
types of libraries. Three library directors describe
their experiences in assessing public libraries of
North Carolina, with sometimes surprising results.
Val Lovett reports on data collection on output
measures in the Wake County Public Libraries.
James Govern (Stanly County Public Library),
reporting on the Childers/Van House multiple
constituencies model and on other output mea-
sures developed by the Public Library Association,
shows how even a small public library can make
effective use of performance measures. Patsy
Hansel discusses the use of the Bunge/Murfin
method of unobtrusive testing of reference ser-
vices at the Cumberland County Public Library &
Information Center.

The application of performance measurement
to youth services has not received a great deal of
attention in the literature. Rebecca Taylor offers a
step-by-step approach to undertaking such
measurement techniques and includes an evalua-
tive review of the relevant literature.

Another area deserving more investigation is
the use of performance measures in the technical
services. Croneis and Wang explore issues dealing
with the efficiency and effectiveness of technical
services and emphasize the need for libraries
operating in an automated environment to take a
holistic view of performance measurement.

Automation in libraries offers us entirely new
ways to collect quantitative data for gauging the
performance of a library system. Ulmschneider
and Mullin examine online performance measures
and describe the system-monitoring tools and
their management uses at the Triangle Research
Libraries Network.

Two articles on academic libraries employ the
case study and the survey methodologies to in-
crease our understanding of performance mea-
sures. Patricia Kelley describes her experiences
with one of the best-known examples of the appli-
cation of performance measures in an academic
library, at George Washington University. She
emphasizes the importance of educating the
library staff before establishing a performance





measures program. A continuing and widespread reliance on traditional evaluation programs, existing simultaneously with a belief in the importance of true performance measurement, is seen in the results of Sally Ann Strickler's survey of academic library administrators.

Finally, Cynthia Levine's annotated bibliog-
raphy offers the reader wishing to delve into the
literature on performance measurement some
recommended points of departure. We regret the
lack of coverage of school librarianship in this
issue, stemming from a paucity of research and
applications related to output measures in that
area.

While this issue of NCL was being edited, state and local governments were in the process
of attempting to deal with budget reductions and
demands for improved services. The next decade
threatens to impose further budgetary restrictions
and demands for accountability upon most librar-
ies. Librarians will find it imperative to state
clearly their goals in terms of services and pro-
grams to users, to devise ways of measuring
progress toward those desired outcomes, and to
demonstrate to their funding agencies both the
value and effectiveness of their organizations. We
hope that this NCL issue will impel library mana-
gers to think about the assessment portion of this
process, and to add performance measures to
their tools for rational decision making.

When it comes to service, EBSCO believes in "being there."

EBSCO has Sales Representatives who, through the years, have traveled hundreds of thousands of miles and worn out scores of shoes just to "be there" for our customers. That's because we feel that to offer truly professional service we must offer truly personal service.

At EBSCO, we think librarians should be served by experienced serials professionals who will "be there" for them. Isn't that what you expect from a professional subscription agency?

EBSCO

SUBSCRIPTION SERVICES

8000 Forbes Place, Suite 204 * Springfield, VA 22151
703-321-9630 (Tel) * 800-368-3290 (Outside VA) * 703-321-9159 (Fax)








But What Does the Data Mean?
Getting From What Happened
to Why it Happened

Sharon L. Baker

During the 1960s and early 1970s, the Ameri-
can economy was so favorable that funding for all
types of libraries increased. Librarians received
most of the resources they needed to implement
or maintain services even though they collected
few data on the real success of library programs.
In the last fifteen years, however, the average cost
of running a library has risen faster than its
income. Today, funding organizations expect
libraries to continue providing quality services
while keeping costs down. They also want "proof"
that library programs are operating efficiently
and effectively.

These changes in the funding climate and the
spread of sophisticated evaluation techniques
through society in general have led various state
and national library associations to promote the
use of performance measures in all types of librar-
ies. Some libraries have been slow to adopt these
measures,1 but their use is growing.

The Macroevaluation of Library Services:
Learning What Happened

Such performance measures generally em-
phasize the macroevaluation of library services.
As Baker and Lancaster (1990) explain in some
detail, macroevaluation studies measure the
success rate of a system; that is, they describe how
well it operates. The results of macroevaluation
studies can usually be expressed in quantitative
terms, such as the percentage of reference ques-
tions answered accurately. For example, and as
Figure 1 shows, the twelve measures discussed in Output Measures for Public Libraries2 are all
macroevaluation measures. Because such mea-
sures show the level of performance at which a
service is operating at a specific date, they serve
as a benchmark.

Sharon L. Baker is an assistant professor at the University of Iowa, School of Library and Information Science in Iowa City, Iowa.

Library directors can use performance data from their own libraries and from comparable libraries to support the argument
that more resources are needed to improve pro-
gram quality. Then, if resources are subsequently
added, the library director can compare results
with this benchmark to see if the service has im-
proved. Benchmark figures can also be reviewed
to determine if the quality or quantity of service is
declining. Indeed, many libraries collect such
performance data to serve as an early warning
signal for trouble spots.3

Unfortunately, librarians who collect this type
of benchmark performance data still have a major
problem. While they know what happened in
regard to a given library program, they often do
not know why a program is or is not successful.
That is, the macroevaluation measures collected
do not give librarians enough information to make
intelligent changes to improve service quality. This
may explain Schlachter and Belli's discovery that seventy-eight percent of the California public libraries that collected performance data made no changes based on the findings.4 Some needed
changes may not have been made for quite valid
reasons, such as a lack of immediate resources to
solve specific problems. But the fact that so many
libraries failed to make any changes may indicate
that collection of this type of macroevaluation-oriented performance data does not, in and of itself, provide enough useful information to improve services.

FIGURE 1.

Macroevaluation Measures Appearing in Output Measures for Public Libraries (Van House et al., 1987)

Annual library visits per capita
Registration as a percentage of population
Circulation per capita
In-library materials use per capita
Turnover rate
Title fill rate
Subject and author fill rate
Browsers' fill rate
Document delivery rate
Reference transactions per capita
Reference completion rate
Program attendance per capita

The Microevaluation of Library Services:
Learning Why it Happened

Diagnostic information which can be used for improvement comes from microevaluation of library services. Microevaluation investigates how a system operates and why it operates at a particular level, that is, what makes it work well or badly. The most important element of this diagnosis is identifying reasons for particular failures. For example, while it is nice to know that fifty percent of a library's patrons did not receive complete and accurate answers to their reference questions, improvements cannot really be made unless the causes of the problem are pinpointed. A microevaluation would examine whether the reference librarians failed to verify the users' "real" information needs, used poor strategies to search the catalog or other bibliographic tools for the answers, or were too busy to accompany patrons to the shelves to show them the specific items that could answer their questions. Microevaluation would also look at other reasons for failure, such as collection inadequacy or poor subject access in the card catalog. This type of microevaluation study is of greater practical use to the librarian because it provides guidance about which actions might be taken to improve reference accuracy. That is, a microevaluation study of this nature tells us what the performance measure (the fifty percent accuracy rate) really means.

Although most of the performance measures promoted by library associations are examples of macroevaluation, librarians can fairly easily expand their data collection efforts to determine how and why these success rates were obtained, that is, to include microevaluation. Let's take a simple example.

Output Measures for Public Libraries suggests that one performance measure, title fill rate, be collected using a simple patron questionnaire.5 Generally, each patron who enters the library during a selected week is given a questionnaire on which to indicate the works being sought and whether or not they are found. The form is turned in as the patron leaves the library. At the end of the week, the total number of titles found by patrons is divided by the total number of titles sought. This gives the library's overall success rate in filling patron requests for specific items.
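As a rough illustration of that arithmetic (a minimal Python sketch; the counts are invented, not drawn from any actual survey), the week's tallies reduce to a single division:

    # Title fill rate from one sample week of patron questionnaires.
    # Each questionnaire lists the specific titles sought and whether each was found.
    titles_sought = 412   # total titles patrons reported looking for (illustrative)
    titles_found = 257    # total titles patrons reported finding (illustrative)

    title_fill_rate = titles_found / titles_sought
    print(f"Title fill rate: {title_fill_rate:.0%}")   # prints "Title fill rate: 62%"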
Evaluators who stop here will know what is happening, that is, what proportion of a patron's needs for specific materials has been met, but they will not have the diagnostic information necessary to increase their fill rates in the
future. They must go beyond such macroevalua-
tion studies and determine if the reasons why
particular titles are unavailable fall into perceiv-
able patterns. For example, are there major collec-
tion gaps? Are popular titles owned but in quanti-
ties insufficient to meet patron demands? Are
purchased titles stalled somewhere in technical
processing so patrons still do not have access to
them? Are the reshelving procedures so slow that
books are sometimes present in the library but
unshelved so that patrons cannot find them?
Such a microevaluation study is actually quite
easy to perform, if the evaluator simply carries
the data collection efforts a bit further. In the
above study of fill rate, the evaluator should not
stop at simply asking patrons to indicate on a
questionnaire whether they found the titles they
were seeking. Rather, as each questionnaire is
turned in, the evaluator should check the catalog,
the shelves, and the circulation area to determine
why the patron failed to find desired items. For
example, several major problems might inhibit
patron access to specific items: acquisitions
barriers, circulation interference, patron errors
(in using the catalog or in searching for materials
in the stacks), or other library errors like mis-
shelving. As Figure 2 shows, each of these problem
areas can be broken down even further. In fact,
the finer the analysis, the more likely the evaluator
is to figure out why books in this library are
unavailable for use when patrons want them.
Once the evaluator has determined the problems
that occur most frequently, library practice can
be changed to prevent, or at least decrease, the
chances of those problems recurring. For instance,
if many titles are unavailable because they are
checked out to other patrons, the library can
either shorten the length of the loan period for
popular titles or can buy more copies of them.
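Carrying the tally one step further is just as mechanical. The sketch below is only an illustration, with invented reason codes patterned on the broad categories in Figure 2; it counts how often each cause of failure occurred during the sample week:

    from collections import Counter

    # One reason recorded for each title a patron failed to find (illustrative data).
    missed_titles = [
        "not owned", "checked out", "checked out", "not yet reshelved",
        "not owned", "mis-shelved", "checked out", "patron catalog error",
    ]

    reasons = Counter(missed_titles)
    for reason, count in reasons.most_common():
        share = count / len(missed_titles)
        print(f"{reason:<22} {count:>2}  ({share:.0%} of misses)")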
Virtually all the macroevaluation measures
that are recommended by library associations
(measures of fill rate, reference accuracy, speed in interlibrary loan or document delivery, etc.) can be used as the first step in a microevaluation study.





FIGURE 2.

List of Reasons for Nonavailability of Titles
Developed by the Iowa City (Iowa) Public Library

The Acquisitions Barrier

The library does not own the title.

The library has ordered it, but it has not yet been
received.

The library has received the title, but it has not yet
been cataloged and processed.

The patron does not know about other options such
as requesting that the title be purchased or asking for it
to be obtained through interlibrary loan.

Circulation Interference

The item is checked out to another borrower.

The item is checked out to Technical Services to be
repaired, re-bound, re-cataloged, or re-labeled.

The item is checked out to "Missing" and has not yet been replaced.

The item is long overdue from another library
borrower and no decision has been made about whether
to replace it.

Library Error

The item is checked in but is not yet reshelved.

The item is mis-shelved.

The call numbers on the item and in the catalog do
not agree.

The library is unaware that the item is missing (e.g.,
it has been stolen).

The item was not properly checked in; the catalog
indicates it is checked out, but it is on the shelf.

The item is currently in use by a staff member but it
is not checked out.

User Error

The user cannot find the item in the catalog (e.g.,
due to incorrect title or author information or incorrect
search techniques).

The user finds the bibliographic record in the
catalog, but misinterprets the information. For example,
he assumes that the bar code or publication date is the
call number or he records the call number incompletely
or in the wrong number order.

The user locates the title in the catalog and copies
the correct call number down, but he cannot find the
location. For example, he doesn't understand the significance of certain terms or symbols in the call number; he can't find the location referred to; he makes mistakes in the alphabetical or Dewey order; he doesn't understand the sequence of shelving units.

The user does not ask a staff member for help at the
catalog or at the shelf. This could be due to his not being
able to find or identify a staff member, to his finding a
staff member already occupied with other patrons, or to
his fear of asking a staff member for help.

In most libraries, a committee of professional librarians can examine any type of raw
performance data, isolate those services with
inadequate performance levels, list a number of
possible reasons why performance might be bad,
and develop a "quick and dirty" study to see what
is actually causing the problem. For example, if a
library discovers that few interlibrary loans are
filled within an acceptable period of time (say ten
days), the librarians can generate a list of possible
causes of the poor performance. These may be
related to the characteristics of materials re-
quested (such as the date and the form of publica-
tion), the size or training of the interlibrary loan
staff, membership in a library network, or factors
relating to other libraries (e.g., although materials
are requested quickly, some requesting libraries
may be slow to fill the orders). A fairly quick
evaluation can identify which of these is the most
likely reason for poor performance. Library staff
can then work to reduce the problem. For exam-
ple, if two libraries within an interlibrary loan
network are found to be very slow in filling mate-
rial requests, staff can be advised to seek materials
from other libraries first.
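A "quick and dirty" check of that kind needs nothing more than the interlibrary loan log. In the sketch below, the lending libraries, fill times, and ten-day threshold are all invented for illustration; the point is simply to flag lenders whose typical turnaround exceeds the acceptable period:

    from statistics import median

    ACCEPTABLE_DAYS = 10

    # (lending library, days to fill the request) -- illustrative log entries
    ill_log = [
        ("Library A", 6), ("Library A", 8), ("Library B", 14),
        ("Library B", 19), ("Library C", 9), ("Library B", 17),
    ]

    days_by_lender = {}
    for lender, days in ill_log:
        days_by_lender.setdefault(lender, []).append(days)

    for lender, days in sorted(days_by_lender.items()):
        typical = median(days)
        flag = "slow" if typical > ACCEPTABLE_DAYS else "acceptable"
        print(f"{lender}: median {typical} days ({flag})")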

Some librarians may feel that they lack the
necessary expertise to conduct microevaluation
studies. And indeed, issues of validity and reli-
ability should be considered to ensure that the
performance data is accurate.6 Validity refers to
whether the evaluator is actually measuring what
is intended and to whether generalizations can be
made from the data collected. Reliability refers to
whether the evaluator can expect to obtain the
same results if the data is collected at a later date
or by a different evaluator.

Aid for librarians who need help with these
or other methodological problems is available in
several forms. Library schools, library associa-
tions, and state library agencies may provide
consultation services, recommend consultants
who are experts in evaluation, or present work-
shops on evaluation techniques. Librarians col-
lecting performance data can also read any of
several recently published books on the topic,
such as Measurement and Evaluation of Library Services,7 If You Want to Evaluate Your Library ...,8 and Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel.9 These books discuss some of the finer points of collecting performance data and cover both the macroevaluation and the microevaluation of library services. The books also recommend variations on particular themes (e.g., using separate fill rate studies for each branch or departmental library and for each format of materials owned). The Baker and Lancaster title summarizes findings of past evaluative studies as well.

Summary

Because of funding limitations facing libraries
today, librarians are collecting more performance
data. Unfortunately, most librarians limit the use-
fulness of the data by collecting information that
focuses almost exclusively on what happened in a
given situation. Such data is useful because it
establishes a benchmark figure against which
future data can be compared. In order to make
real improvements in service, however, librarians
also need to explore why and how things happen
in libraries. That is, librarians need to determine
the causes of particular problems so that effective
changes can be made.

References
1. Sharon L. Baker, "The Use of Output Measures for Public Libraries in North Carolina Public Libraries" (Iowa City, Iowa: School of Library and Information Science, 1987), ERIC document number ED 288 538.
2. Nancy A. Van House et al., Output Measures for Public Libraries: A Manual of Standardized Procedures, 2d ed. (Chicago, Ill.: American Library Association, 1987).
3. Association of Research Libraries, Office of Management Studies, Planning for Management Statistics in ARL Libraries, Systems and Procedures Exchange Center, Kit 134 (Washington, D.C.: Association of Research Libraries, 1987).
4. Gail Schlachter and Donna Belli, "Program Evaluation: An Alternative to Divine Guidance," California Librarian 37 (October 1976): 26-31.
5. Van House.
6. Terry L. Weech, "Validity and Comparability of Public Library Data: A Commentary on the Output Measures for Public Libraries," Public Library Quarterly 8 (1988): 7-18.
7. Sharon L. Baker and F. W. Lancaster, Measurement and Evaluation of Library Services, 2d ed. (Washington, D.C.: Information Resources Press, in press).
8. F. W. Lancaster, If You Want to Evaluate Your Library ... (Champaign, Ill.: University of Illinois, Graduate School of Library and Information Science, 1988).
9. Jane B. Robbins and Douglas Zweizig, Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel (Madison, Wis.: School of Library and Information Studies, University of Wisconsin-Madison, 1988).

Southeastern
Microfilm Inc.

Product, Equipment
and Service

The established leader in innovative
approaches to micrographics for
records management.

We Offer:

the most complete line of microfilm
products, equipment and services in
North Carolina;

the only full-service micrographics
processing center in the state;
state-wide equipment service &
maintenance;

on-site microfilming services;
guaranteed ANSI & AIIM state,
federal or Department of Defense
standards.

We are an authorized
micrographics dealer for

MINOLTA

Raleigh • Greensboro • Charlotte • Asheville

Call toll Free: 1-800-532-0217






The One-Stop Source for New Definitions!

Third Barnhart Dictionary of New English

Edited by Robert K. Barnhart and Sol Steinmetz, with Clarence L. Barnhart
592 pp. 1990 ISBN 0-8242-0796-3
$49 U.S. and Canada, $59 other countries.

The Third Barnhart Dictionary of New English is a lexical index of some
12,000 new words, abbreviations, and acronyms created as a result of
unprecedented scientific, technological, and cultural advances and activities.
A ready reference for the casual reader and language scholar alike, this one-
of-a-kind resource serves as a fascinating record of language in action.

"A highly practical tool ... will be the linguist's treasure in the twenty-third century ... makes excellent reading."
American Reference Books Annual (on the Second Barnhart Dictionary of New English)

Authoritative Scholarship...

Aided by an Editorial Committee of distinguished language scholars from around the world, the editors have created the most authoritative dictionary of new English to date.

... With Practical Reference Value

Clearly written and easy to use, the Third Barnhart Dictionary of New English consolidates hard-to-find data in a single, manageable volume for readers and researchers at all levels, in all types of libraries:

• Academic and Research Libraries: Invaluable for any student of the English language, linguistics, or American culture
• School Libraries: Ideal for language arts or vocabulary programs
• Public Libraries: A practical guide for the casual reader or language lover
• Special Libraries: Useful for patrons who need to decipher new words or technical jargon.

H. W. WILSON

To Order Call Toll-Free

1-800-367-6770.

For credit card orders request Ext. 8.
In New York State call 1-800-462-6060;
in Canada call collect 1-212-588-8400
Telefax 1-212-590-1617.

Also of Interest

Barnhart Dictionary
of Etymology

by Robert K. Barnhart
1,284 pp. 1988 ISBN 0-8242-0745-9

$59 U.S. and Canada,
$69 other countries

Traces the evolution of more than
30,000 words basic to contemporary
American English.








Federal-State Cooperation for
Public Library Data

Kitty Smith

The information and related services provided by American public libraries in just about every corner of the country are a national bargain. Compared with what it costs the taxpayer to build, maintain, and operate just a few Stealth bombers or space shuttles, the public library's cost per unit of service is certainly a consumer's "best buy." At the same time, public libraries in the United States make a large contribution to the economic health of the nation. On the whole, they constitute more than a four billion dollar industry, and employ over ninety thousand persons. They spend over a half billion dollars on books and other materials to provide information at no charge. Their number of outlets rivals that of the most prolific fast food franchises.1

"If We're So Smart, How Come We're Not Rich?"

Any marketing novice knows that high-quality products that meet customer needs, are packaged to suit the customer, and are offered at an unbeatable price, combined with a distribution system already in place in practically every community, should be in a good position to win the lion's share of their markets. It follows, therefore, that in an age when information and information-related products are needed in every aspect of daily life, libraries might be expected to be at the top of the list of leaders in the information marketplace.

Why is it, then, that public libraries do not receive top recognition and priority from the public at large, and from their elected local, state, or national governments? Why must public libraries constantly struggle just to maintain minimum funding levels needed to operate effectively? Part of the answer may be the scarcity of public information available nationally, or even regionally, about the extent and variety of the benefits of public libraries. Creators of policy, administrators, and citizens must have timely, dependable information if the nation's public libraries are to continue providing superior service.

Kitty Smith is Assistant Professor of Library and Information
Studies at the University of North Carolina at Greensboro.

Yet until very recently there has been little awareness about them on the part of government and the public. It would be an oversimplification to blame the problem on the libraries' failure to "get the word out."
Public libraries themselves have not had access to
the kind of comprehensive national information
they need to manage and assess their operations
effectively, let alone to create national apprecia-
tion of libraries.

The purpose of this article is to provide
details on the background, organization, adminis-
tration, and activities of the Federal-State Cooper-
ative System for Public Library Data (FSCS).2 This
new, nationally coordinated system holds great
promise for providing the comparative data
needed by libraries and policy makers into the
1990s and the next century.

The Evolution of FSCS

The United States government began collect-
ing information about public libraries more than
one hundred years ago. In the 1867 legislation
creating the U. S. Office of Education (USOE),
Congress described the agency's function and
obligation to

collect such statistics and facts as shall show the condi-
tion and progress of education, to diffuse such informa-
tion as shall aid the people of the United States in the
establishment and maintenance of efficient school sys-
tems, and otherwise promote the cause of education.

Instinctively, the young USOE identified libraries
as an important component in the "cause of education." By 1876, the agency had compiled an
extraordinary fund of descriptive and statistical
data. This data appeared in the report on public
libraries in the United States, just in time for the
United States centennial celebration. It was to be
another sixty years, however, until a distinct unit
for library services was authorized within the
USOE. Appropriations for this unit were specifi-
cally "for expenses necessary for the Office of Education, including surveys, studies, investigations and reports regarding libraries."






In 1937, in the heart of the Great Depression, the unit started collecting statistics and assessing the condition of the nation's public libraries. Its findings were most discouraging, especially in the rural localities, where libraries and library services ranged from impoverished to nonexistent.3 In North Carolina, for example, over two-thirds of the population had no access to a public library facility, and existing libraries averaged revenues of only four cents per capita. Statistics like these, however disheartening, became the primary catalyst for passage of the Library Services Act (LSA) of 1956, and the subsequent Library Services and Construction Act (LSCA). Both LSA and LSCA targeted rural and other underserved segments of the population for development of library services.4 (According to 1988 estimates, one hundred percent of North Carolina's population had some access to services offered by 347 public library outlets. Total federal, state, and local operating receipts averaged about $10.40 per capita.)5

Almost from the start, these federal grant
programs reinforced and intensified the role of
the state library agencies. By requiring the indi-
vidual state libraries to plan and oversee the
administration of grants, LSCA provided a model
for federal-state communication and cooperation.
In addition, the states had a powerful incentive to
improve their methods of data collecting. This
enabled them to assess the effects of the federal
grants program and report to federal authorities.
Federal agencies worked with the states to estab-
lish and delineate consistent standards and ter-
minology. The major burden of data collection
came to rest at the state level, while the USOE
library programs office concentrated on the
analysis of data from the states. USOE's analyses were used to support federal legislative and executive initiatives.6

The mid-1960s were an era of massive social
upheaval. Virtually all units of the federal govern-
ment responded with historic activity. The evolv-
ing "new federalism" was reflected in Congress' establishment of the National Center for Education Statistics (NCES) within a reorganized USOE.
This legislation institutionalized the compilation,
evaluation, and distribution of national education
statistics in the federal government. A National
Conference on Library Statistics, sponsored by
the American Library Association (ALA) and


USOE, was held in 1966. The purpose of the
conference was to bring interested agencies and
persons together to discuss ways of organizing a
national system of data collection to satisfy local,
state, and national needs for library information.
In the same year, ALA published Library Statis-
tics: A Handbook of Concepts, Definitions, and
Terminology. This publication was a significant
improvement in itemizing, categorizing, and defin-
ing data for all types of libraries. An eventual out-
come of the ALA-NCES association was a 1970
report entitled Planning for a Nationwide System
of Library Statistics. Two of its recommendations
were of particular significance for state-federal
efforts: (1) it was essential for NCES and the
states to share responsibility for library statistics
in a "highly defined, coordinated, and regularized" program; and (2) such shared responsibility meant that training programs at the state and local levels were imperative "for general understanding, accuracy of returns, and compliance."7

These recommendations set the stage for a
most ambitious project during the 1970s. The
Library General Information Survey (LIBGIS) was
conceived as a national data program that would
coordinate local, state and federal agency efforts
into a comprehensive reporting system. Neverthe-
less, in spite of high initial hopes for its success,
LIBGIS never fully reached maturity as a national
data system. The project lost impetus in the poli-
tical and technological mutations of the late
1970s. In the early 1980s shifts in federal spending
priorities brought LIBGIS to a halt.8

At mid-decade there was still no coordinated,
comprehensive national program of public library
statistics, although prospects for such a system
had not been totally extinguished. State library
agencies were still collecting statistical informa-
tion from and for the libraries in their respective
states. Without any real national coordination,
however, there were some serious challenges
ahead. In an effort to explore contemporary prac-
tices in public library data collection, the ALA
Office for Research investigated the various
instruments used by states for data collection and
reporting. Each state library's forms for public
library data collection were requested, along with
copies of each annual statistical report. The forms
and reports were analyzed to determine common-
ality of data items for possible national and
regional comparison. The conclusions of this re-
search were that there were some rather disturb-
ing inconsistencies from state to state. For
example, the states were using so many diverse
ways to count collection resources of public
libraries that fifty-eight percent of these data





items were unique to only one state. In the areas
of circulation and registration, seventy-three percent of the data items were unique to one state.
Results in other areas such as interlibrary loan,
income, expenditures, and reference were no
better.9

Rather than focus negatively on these findings,
however, the Office for Research initiated a team
effort with the Chief Officers of State Library
Agencies (COSLA) and the Public Library Associa-
tion (PLA) to identify a core common set of data
items that could be collected in the same way,
using the same terminology in each state.

In 1985, the Department of Education's Library Programs Office and the NCES co-sponsored a very productive pilot project based on the common data elements identified by the ALA-PLA-COSLA group. Fifteen states participated in the landmark data collection venture. Then, in 1988, Congress passed the Elementary and Secondary School Improvement Amendments of 1988 (P.L. 100-297), appropriating funding that infused new life into NCES's public library data activities. Federal law, for the first time, specifically charged NCES with responsibility for collecting data on libraries. Statistics on all types of libraries were to be included among the ongoing activities of the Center. The law also mandated representation of the National Commission on Libraries and Information Science (NCLIS) on NCES's Advisory Council on Education Statistics. This council's responsibility is to set standards, ensuring "technically sound data, not subject to political influence."11

Early in 1988, NCES and NCLIS set up a Task Force to develop an Action Plan for the Federal-State Cooperative System for Public Library Data, as dictated by the School Improvement Amendments. National and state organizations (i.e., NCES and the Library Programs Office of the Department of Education, NCLIS, ALA, the Public Library Association, the Library Administration and Management Association (LAMA), and COSLA) appointed representatives who were interested in and committed to accurate and reliable annual state and national data. In summer 1988, NCES requested and obtained the cooperation of COSLA in appointing a state data coordinator for each of the fifty states and the District of Columbia.

The Action Plan, as developed by the Task
Force, includes a universe file (or name authority
file) of all public libraries in the country. In addi-
tion it specifies system operations, the data items
to be collected, definitions, analyses, and publica-
tions to be generated, as well as formats for statis-
tical tables. Currently there are forty-one data

items in the system covering basic statistics for:
the number of service outlets, number of em-
ployees, library income, operating expenditures,
size of collections, service hours, services, circula-
tion, and interlibrary loans. Items will be pre-
sented by state and by population of library
service area.

The Action Plan divides labor among local
and state libraries and NCES. The local public
libraries are responsible for collecting local library
information and relaying it to their respective
state agencies (usually as part of the states'
normal data-gathering practices). The states, in
turn, provide training for local libraries from
whom they gather statistics, and relay the infor-
mation to NCES in computer-readable format.
Finally, NCES compiles the data submitted by the
states for publication and dissemination. NCES is
also responsible for training and continuing edu-
cation for participating State Data Coordinators.
At this writing all fifty states and the District of
Columbia have named a coordinator. National
training workshops for Coordinators were held in
Annapolis, Maryland, in December 1988, and in
Phoenix, Arizona, in December 1989.

Accurate, reliable data will
help individual libraries
report to their governing
bodies and the public in more
meaningful ways than ever
before possible.

The North Carolina State Library was among
the first group of nineteen state library agencies
to submit FSCS data (in Lotus 1-2-3 format) to
NCES in July 1988. The state's participation was made possible through cooperation between the State Library's Public Library Development Sec-
tion and the Statistics and Measures Committee,
Public Library Section of the North Carolina
Library Association. Their work resulted in a
revision of the annual data collection forms to
include the data elements prescribed by FSCS,
and revision of the annual statistical report to
incorporate concepts such as "output" measures
and comparisons of libraries by population of
service areas.

In 1989, forty-four states and the District of
Columbia participated. In July 1989, NCES,
working with the Task Force's Technical Committee, provided each participating state with a copy of "DECTOP" (for "Data Entry Conversion; Table Output Program"). This new program, developed
for use on a personal computer, affords quick and
dependable input of data by state personnel and
processing by NCES. DECTOP lets states extract
the FSCS-required data items from their existing
administrative files, input them through a choice
of common application software, and edit for
errors automatically. When the data has been
corrected the states can produce the same tables
as NCES for review before submission. The state
then uses DECTOP to prepare a floppy disk,
which is sent to NCES.
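The passage above describes DECTOP only in general terms, and the sketch below is not DECTOP or the FSCS data dictionary; it is a purely hypothetical illustration, with invented field names and rules, of the kind of automatic edit checking a state might run on its extract before submission:

    # Hypothetical pre-submission edit check on one library's record.
    # Field names and rules are invented; they do not reproduce the FSCS specification.
    REQUIRED_FIELDS = ["library_name", "population_served", "total_income",
                       "total_expenditures", "annual_circulation"]

    def edit_check(record):
        """Return a list of problems found in a single library's record."""
        problems = []
        for field in REQUIRED_FIELDS:
            value = record.get(field)
            if value in (None, ""):
                problems.append(f"missing {field}")
            elif field != "library_name" and (not isinstance(value, (int, float)) or value < 0):
                problems.append(f"{field} must be a non-negative number")
        return problems

    record = {"library_name": "Example County Library", "population_served": 52000,
              "total_income": 540000, "total_expenditures": -1, "annual_circulation": 310000}
    print(edit_check(record))   # ['total_expenditures must be a non-negative number']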

The Action Plan also prescribes the develop-
ment of a universe or authority file identifying
public libraries in each state. For this purpose, the
Technical Committee and NCES will be supplying
the state agencies with "PLUS" (Public Library
Universe System), a customized personal com-
puter application similar to DECTOP. Initial use
of PLUS is planned for 1990.

With a permanent, coordi-
nated system of public library
data collection in place,
libraries can make their value
known to those they serve and
those who provide resources
to them.

"Where's the Beef?"

What are the payoffs expected from total
participation of the states in FSCS?

1. Public libraries can use the uniform statis-
tics to evaluate their own performance, compare
themselves with libraries of similar profile, and
set priorities for the future. Accurate, reliable
data will help individual libraries report to their
governing bodies and the public in more meaning-
ful ways than ever before possible.

2. State and federal library agencies need
good data to plan legislation and budgets that are
cost-effective and make sense in terms of public
need. Statistics are the backbone of the evaluation
of grant and service programs.

3. Private sector firms that do business with
libraries need dependable statistics to generate
useful business and marketing plans.

4. Library statistics are essential to the work
of educators, researchers, and media personnel
for study and reporting.

5. Library professional associations at the
local, state, regional, and national levels count on library data to develop standards, and present
positions on government programs affecting
library services.12

6. Finally, statistics will be integral to the
successful outcome of the second White House

Conference on Library and Information Services
(WHCLIS), which has as its goal the development
of recommendations for the further improvement
of the library and information services of the
nation. William G. Asp, Chair of the White House
Conference Preliminary Design Group, recently
described WHCLIS in terms which might be equal-
ly applicable to the Federal-State Cooperative
System:

"[It] is not an event; it is a process. With library and information services an integral part of a democratic society, the process involves people from every state ... to discuss issues of library and information services at all levels.... It is a dynamic process that identifies user needs as a basis for realistic planning as we approach the 21st century."13

With a permanent, coordinated system of
public library data collection in place, libraries
can make their value known to those they serve
and those who provide resources to them. They
can answer important questions heretofore un-
asked or unanswerable: Have our state and federal
library programs met the goals they were intended
for? Are we getting a fair return in benefits for our
tax dollars? What is the quality of service? Is it
truly equal and available to all, especially children,
the elderly, the poor and others not in the main-
stream? Are our libraries really, as Librarian of Congress James Billington put it, "the golden entry point, the point of assurance that there will be democracy in the future"?14

References

1. Robert Dugan. Speech given at the Second Annual FSCS
Training Workshop, Phoenix, Arizona, December 4, 1989.

2. Much of the background material for this article was ob-
tained from the Action Plan for a Federal-State Cooperative
System for Public Library Data, developed by the Task Force on
a Federal-State Cooperative System for Public Library Data,
with the support of the U. S. National Commission on Libraries
and Information Science and the National Center for Education
Statistics. (The Task Force, April, 1989); hereinafter cited as
Action Plan. Readers are encouraged to consult the Action Plan
for more detailed information on the FSCS.

3. Action Plan, 5.

4. Ibid., 5.

5. Statistics and Directory of North Carolina Public Libraries,
July 1, 1987-June 30, 1988. Rev. ed. (Raleigh, N.C.: Division of
State Library, North Carolina Department of Cultural Resources,
1989), p. 1.

6. Action Plan, 5-6.

7. Ibid., 6.

8. Ibid.





9. "Analysis of Library Data Collection" (ALA Office for Research, 1984), Summary Table One, Book One, 5, photocopied.
10. Action Plan, 1.
11. Emerson J. Elliott, Acting Commissioner of Education Statistics, Washington, DC, to Colleague, November 7, 1989.
12. Wanted: Facts About Public Libraries (Washington, DC: U.S. National Commission on Libraries and Information Science; National Center for Education Statistics, 1989).
13. William G. Asp, "What We Are ... Who We Are ... What We Do" (Fact sheet on the White House Conference), 1989.
14. James H. Billington, "What We Are ... Who We Are ... What We Do" (Fact sheet on the White House Conference), 1989.

REEL READERS contains 60 program plans built around outstanding children's films. Programs include books, songs, poetry, flannelboards, read-alouds, crafts, booktalks, and other activities as appropriate.

Among the themes included: Preschool: Cats, Colors, Names, Toys, and Winter. Primary: Circus, Dragons, Sea Creatures, Silly Food, Trading Places, and Unusual Pets. Intermediate: Adventure, Being Bad, Dinosaurs, Haunted Houses, Movie Making, and Tall Tales.

Published by the Children's Services Section of the North Carolina Library Association; all proceeds go to the Section for funding of future projects and programs.

Order Form

Mail to: NCLA/Children's Services Section
c/o Gail Terwilliger
1813 Sunnyside Circle
Fayetteville, NC 28305

Name ____________________________ Day Phone ____________

Institution

Address

City

(number of copies) @ $ 10.00 each

(shipping and handling) @ $ 2.50 per copy

Total enclosed. Make check payable to CSS/NCLA






"When we wanted to improve our
serials management, Faxon
responded with DataLinx. We
needed journal availability
ala quickly.
ey gave us online

access to other libraries!
check-in records.

When Faxon responds, the whole
subscriber community benefits.

Faxon has helped us through competitive pricing policies and
global access to publications. Now they're enhancing relations in
the broader subscriber/publisher community by advancing
common data communication standards and promoting shared
resources. In this sense I see them as colleagues."

Ellen J. Waite, University Librarian, Loyola University of Chicago

Helping you manage your world of information.

To learn more about the Faxon Company, the international subscription agency with a commitment to quality service, call 1 (800) 766-0039.







Use of Staff Output Measures in the
Wake County Public Library System

Val Lovett

As daily suppliers of statistics to the public, librarians might be assumed to be comfortable using statistical measurement as a tool to study staff work production, to evaluate staff effectiveness, to allocate staff resources, and to establish work standards. Hah! The profession is so ambivalent about statistical measurement of staff output that even comparative research studies are enshrouded with "Yes, buts." As for statistical measurement in one's own bailiwick, anxiety here is the most intense among administrators, managers, and staff alike.

I spent a day at the UNC-CH School of Information and Library Science trying to find articles or research about how output measures, work statistics, or any other measurements of staff output are used to construct budgets, allocate resources, plan new services, design new buildings, or request additional staff. I found articles about accuracy in reference work that once again sent shivers down my spine; I found information on how to construct a budget which avoided any specifics as to methods used to determine staff levels; I read some cryptic articles on what types of data are being collected, mostly in technical services departments, but I did not find any articles on the application of work statistics to the allocation of library resources or on the construction of budget requests. I was amazed.

Now I know everyone is looking at everyone else's data. Just last March, if one paused in one's daily routine, one could hear the sound of all the public library directors in this state ripping open the envelope that contained the North Carolina Division of State Library's annual compilation of public library statistics. One could hear the pages being riffled, the sighs of relief and the groans of
disappointment, as each director compared his or
her library to the closest rivals. One can imagine
the acceptance of the good, the rejection of the
bad, and the rationalization of the ugly.

Val Lovett is Assistant Library Director, Administrative Services, for the Wake County Public Library System in Raleigh.

Managers and administrators routinely use statistics to make decisions about library operations, but they do not use them openly nor do they use them enough. There is not a healthy balance
between objective measurement and subjective
evaluation. Although we are doing fine with the
subjective assessments, we are too wishy-washy
about the intelligent use of staff work production
data in allocating resources. We talk about
political realities, circumstances beyond one's
control, and other stock phrases to wrap ourselves
and our staffs in the cotton wool of unreality that
statistics do not count. Then why are we counting?

The problem begins at home. Administrators
should decide what work production statistics
will be collected, how they will be evaluated, and
how they will be used to make decisions. The data
measurements chosen should relate directly to
the library's mission statement, long range plan-
ning goals, and the current year's plan for action.
These selections should be discussed thoroughly
with the library staff, who are not only the primary
collectors of the data, but usually the most resent-
ful and suspicious of its use. No one likes to see
results of his or her work reduced to numbers,
especially when one does not know how those
numbers will be used and may suspect they will be
used against oneself and the status quo.

The manager must overcome this understand-
able staff resistance by using staff input to design
and refine collection instruments. As the advocate
for the use of this data, a manager must convince
the staff of its responsibility for the validity of the
statistics through the staff's reliability in the
collection of the information. In my experience,
the more reliant we are upon human beings to
count ephemeral data, the more unreliable it is.
For example, whether one uses a manual or an
automated circulation system, there is something
tangible to be counted. Contrasted with this, refer-
ence question tabulation is entirely dependent
upon the accuracy of the staff in recording the
data regardless of the method used in collecting
it. When I talk with reference librarians about
improving enumeration, they express their frus-
tration in trying to keep an accurate count when
their focus is on service to the patron. To them
the patron services are the most important and I






agree with that emphasis. The viewpoint often
expressed by reference staff members is that if
there are any doubts that more staff is needed, then
"they" ought to come to the library and work a few
days.

It is vital to explain and discuss with the staff
the role that data analysis has in decision making
by library administration so that one can lower
their frustration level. One can demonstrate the
effect good data collection can have on the
library's services. Also, the entire staff should
analyze the data so that further refinement of the
instrument and data evaluation is done by line
and management staffs. This will build credibility
for the process and help eliminate some of the
mystique about use of the results.

Having done this, each year before data collec-
tion begins, the library administration projects
the performance levels it believes the system
should achieve in circulations per capita, turnover
rate of the collection, books processed per hour,
reference questions answered, story hour atten-
dance, or percentage of the population registered
as library patrons. Since data collection is an
ongoing process, the administration is setting
targets to reach for the upcoming year based
upon both past performance and the annual plan
for the library system. As mentioned earlier, the
chosen measurements should be an outgrowth of
the mission statement and the goals of the library
system. Then, data collection and evaluation be-
come a method for assessing success in reaching
the objectives set forth in the annual plan for the
library system. Establishing these target levels for
service achievement is similar to the private sec-
tor's setting goals for manufacturing and sales.
Now the administrator and the manager can
discuss in detail the productivity targets for the
branch or the department. They can work togeth-
er from the goals established for the entire organ-
ization to the particular objectives set for the
work unit. In addressing increased productivity,
there is every reason to discuss increasing the
work product by specific percentages or numbers,
for example, increasing the circulation of juvenile
non-fiction by thirty percent during the fiscal
year. The administrator and manager can talk
about the activities and resources needed to ac-
complish this objective. A specific discussion is
more productive than a vaguely stated direction
such as "I want you to work on increasing circula-
tion of the juvenile non-fiction materials." Working
as a team, they can develop the necessary activi-
ties to achieve their objectives. This process can
be used in all departments of the library, and it
addresses the expected output measures for the
individual work unit.
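The arithmetic behind such a target is simple enough to keep in front of both parties during the year. The sketch below is purely illustrative (the baseline and monthly counts are hypothetical, not Wake County figures), but it shows how a thirty-percent circulation objective becomes a number that can be checked month by month.

    # Illustrative only: turning "increase juvenile non-fiction circulation by
    # thirty percent this fiscal year" into a checkable number. The baseline and
    # monthly counts below are hypothetical.

    baseline_circulation = 12_000            # hypothetical: last year's juvenile non-fiction circulation
    target_circulation = baseline_circulation * 1.30

    monthly_counts = [1_100, 1_150, 1_300]   # hypothetical counts recorded so far this year
    months_elapsed = len(monthly_counts)

    expected_to_date = target_circulation * months_elapsed / 12
    actual_to_date = sum(monthly_counts)

    print(f"Annual target: {target_circulation:,.0f}")
    print(f"Expected by month {months_elapsed}: {expected_to_date:,.0f}")
    print("On pace" if actual_to_date >= expected_to_date else "Behind pace",
          f"({actual_to_date:,} circulations to date)")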

Staff output measures also can be used to
establish work production standards for individ-
uals as well as the entire unit. As managers, we
must be fair to staff in expecting the same stan-
dard of work from all employees in the same jobs.
The standard should be achievable, but also high
in quality as well as quantity. Low or non-existent
work production standards allow everyone to
achieve a level of mediocrity. In my experience
this has occurred most frequently in the clerical
areas of the library profession such as typing cata-
log cards, editing records, filing cards, or shelving
books.

Librarians become very defensive about estab-
lishing production standards for reference work,
cataloging, or children's programming (i.e., "profes-
sional work"). I believe we have avoided develop-
ing performance standards for professional and
para-professional positions for several reasons.
The work performance standard in these areas is
more difficult to establish, but not impossible. I
think we resist turning our work into a statistical
measure because we feel it demeans and over-
simplifies what we do. Well, that argument is also
applicable to those jobs in our libraries for which
we are comfortable in using work standards.

All this discussion is the easy part. It is the
prelude. Now one can begin to use the subjective
impressions and empirical data together to under-
stand the dynamics of the library system. When
the empirical information is contrasted with the
subjective, even though many subjective deduc-
tions are valid, there will be some surprises. The
data will assist one in identifying specific differ-
ences among similar situations, the deviations
from the mean and/or the median. Investigating
these highs and lows can bring valuable insights,
with resulting improvements in service. However,
we must design those sophisticated means of
measuring and quantifying that work because of
the important information it can provide for
library management decisions.

Support Services Case Study

In 1981, the cataloging and processing back-
log at Wake County Public Libraries was approxi-
mately six weeks from receipt of the books, with
some problem titles lingering on the shelves for as
many as six months. Many titles, especially popu-
lar ones, were not received at library branches for
months after they were available in bookstores.
The branch staff bore the brunt of the public ire
so that the working relationship between public
services and support services was not genial. At
that time the Support Services Division was pro-
cessing approximately sixty thousand books per
year. The Order Section used the Libris online
ordering and accounting system. The Cataloging
Section used OCLC/SOLINET. The card catalog
had been closed on April 1, 1979, so the public
catalog was published in microfilm format.

In late 1981, the library director set goals for
the Support Services Division. He instructed the
two managers of the division to reduce the turn-
around time from the receipt of the books to the
shipping of the books to the library branches to
one week. The only exception was that high
demand materials were to be ready to leave the
building in one day. In addition, books were to be
ordered and selected so they appeared on library
bookshelves at the same time as they did in com-
mercial bookstores.

The members of the Support Services Division
achieved those goals within the year. They did this
by meticulously flow charting each step of every
operation. Then every step in the entire process
was examined rigorously for its relevance and its
efficiency. What happened in the Processing Unit
is a good example of production standards help-
ing to improve productivity.

The work done by the library processors at
that time was the physical preparation of the book
for the library shelves. Jacketing, pocketing and
carding, accessioning, property stamping, and
spine labeling represented the majority of the
work. Book trucks were always conspicuously
ganged up in this area. There were no work
production standards; everyone simply came to
work and processed books. All the staff felt
oppressed by the work that was piled up behind
them.

For three months statistics were kept by
individual processors. The work productivity
achieved varied widely. There were several meet-
ings of that staff with the head of cataloging who
was the manager responsible for the unit. The
staff set a work standard of three thousand books
per month per staff member. After six months the
individual work statistics were reviewed. The
standard was found to be too high and was
revised to twenty-five hundred books per month
per processor. This standard is in use today.

Today, only in the first rush of the fiscal year
ordering do the processors have a few trucks
backed up. However, they clear them very quickly.
They are processing approximately one hundred
fifty thousand books per year with only one addi-
tional staff person. When there is not enough work
to do, they assist other support services units and
library branches. During the past fiscal year, they
have been instrumental in assisting smaller bran-
ches in linking collections to our CLSI system.
During the upcoming year they will have linking
duties for new books assigned to the unit. This
change will necessitate revision of the work
standards by that staff, the supervisor, and the
manager.
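As a rough illustration of what such a standard implies for capacity planning (the article does not give the unit's staff count, so the figure below is only an inference from the published numbers):

    # Illustrative arithmetic: what the 2,500-books-per-month standard implies
    # for an annual intake of roughly 150,000 books. The staffing level is
    # inferred, not reported in the article.

    books_per_processor_per_month = 2_500
    annual_intake = 150_000

    processor_months = annual_intake / books_per_processor_per_month
    full_time_processors = processor_months / 12

    print(f"{processor_months:.0f} processor-months per year")
    print(f"about {full_time_processors:.1f} full-time processors implied")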

This example illustrates how all the members
of the Support Services Division turned them-
selves into customer-oriented, public services
employees. Since 1981, the workload has risen
from sixty thousand to one hundred fifty thou-
sand new books representing approximately
11,000 titles. During these years, this staff also
managed to convert retrospectively all title hold-
ing records to machine-readable format and
install the CLSI circulation and public access
catalog modules. The cataloging standards con-
tinued to be AACR2/MARC format, there were no
compromises in the finished physical product,
books are being received at the same time as the
bookstores put them on their shelves, and the
division has transferred three positions to public
services.

The results have been improved service deliv-
ery to customers at a lower cost per unit of
production, production expansion which kept
pace with a growing book budget while also being
flexible enough to do retrospective conversion,
and library automation. The production standards
helped improve the work of employees, helped
eliminate non-productive employees, and through
merit raises rewarded excellent employees.

It is also significant that the director did not
tell managers and staff how to achieve the goals
that he set. Since they had the expertise to make
the choices, staff members made those decisions.
The importance for administration is that the
managers and staff made and adopted the
changes rather than having them imposed upon






them from outside the division. This is one way
that staff output measures can improve produc-
tivity and service delivery without using additional
dollars.

Ideally, if we understand the level of work an
employee can achieve, there are many positive
uses for that measure. Take, for example, a refer-
ence librarian. If we establish the number of
reference questions which can reasonably be
answered in an hour, we can extrapolate potential
work load for the entire staff. Then we can con-
struct schedules to meet demand from patrons.
We can pinpoint when demand outstrips human
resources and affects the quality of service the
staff can deliver. We can identify those hours in
the week when that critical point is reached.
When the demand for service has outstripped
available resources, we have the information to
support additional personnel requests with the
budget office. Those personnel requests can be
more accurate than in the past. For example, one
might request two half-time positions to target
overloaded nights and weekends, rather than a
full-time position working some hours where
demand is less critical.
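One way to put that extrapolation on paper is sketched below. The questions-per-staff-hour standard and the hourly figures are hypothetical (as noted later, no established standard for reference questions exists); the point is only to show how such a standard, once chosen, turns hourly tallies into a schedule and a staffing argument.

    # A minimal sketch of the extrapolation described above. The standard of
    # six questions per staff hour and the sample hours are hypothetical.

    QUESTIONS_PER_STAFF_HOUR = 6.0   # assumed work standard, not an established figure

    # (hour of the week, questions received, staff scheduled at the desk)
    sample_hours = [
        ("Monday 10-11 a.m.", 9, 2),
        ("Monday 7-8 p.m.", 16, 2),
        ("Saturday 2-3 p.m.", 21, 3),
    ]

    for label, questions, staff_scheduled in sample_hours:
        staff_needed = questions / QUESTIONS_PER_STAFF_HOUR
        status = ("demand exceeds capacity" if staff_needed > staff_scheduled
                  else "adequately covered")
        print(f"{label}: {questions} questions, {staff_scheduled} staff scheduled, "
              f"{staff_needed:.1f} needed ({status})")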

Reference Case Study

The statistics from Table 1 will be used to
discuss several points about staff output mea-
sures. I must confess that Wake County Public
Libraries does not have a performance standard
for the number of reference questions per hour
for a staff member. Therefore, we have to use the
information available to us. [Note: I am not aware
of any existing standard in use for reference ques-
tions, although I am interested in the possibility of
developing one.]

During the preparation of the library system's
personnel request for the FY 1991 budget, there
were a number of requests submitted by library
branches for additional personnel to maintain
existing levels of service. There was a subjective
opinion that Branch C should have the first
priority position in that request because it is so
busy. The data reveals that although it does field
more questions per hour than any other branch,
there is a relatively comfortable level of average
demand on each staff member. The same cannot
be said of Branch B. Because these statistics do
not include directional questions, instructions on
the use of equipment or reference tools, making
change, or other requests that take time, we knew
empirically and subjectively that Branch B should
have the priority position in new staff requests.

Budget analysts do not conceptualize "service"
well at all. A statement that reference service at
the branch was deteriorating because the demand
for service is higher than the staff can handle
does not mean much to my budget analyst. Even if
I had stated that at peak hours the staff might as
well stand behind the desk and randomly throw
books at the patrons, while I might have made a
point, I have not proven it. I must translate service
delivery into the language of the budget adminis-
trator, or I will be on the losing end in the struggle
for a greater share of the budget dollar. Therefore,
if I can translate service into a statistical measure
and relate it to a work standard (even if it is more
than a little subjective), then the budget analyst
and I can examine the staffing issue based upon
the reasonably achievable work in a staff hour. A
variant of Table 1 was used in the budget docu-
ment for FY 1991.

In FY 1991, Wake County added more than
two hundred new staff positions, most of them
related to capital projects, such as the new Public
Safety Center which was coming online. There
were only ten positions funded in the County to
deal with growth in existing services. One of those
positions was a new professional position for
Branch B. This is an example of how staff output
measures can add more dollar resources.

Another point to be made is that Branch B
helped itself by positioning itself. At the end of the
previous fiscal year the branch manager told me
he felt the staff was seriously undercounting refer-
ence questions. We talked about the importance
that data had on budget requests. He included
activities to improve data collection in his work
plan. As a result, the Adult Services Department
recorded fifteen thousand additional questions as
answered.

TABLE 1.
Adult Services Staff Output Measures Estimates for Reference Questions
in Selected Wake County Public Library Branches, FY 1990

                          Branch A    Branch B    Branch C    Branch D
Staff Hours/Year            16,000       5,000      16,000      10,000
Hours Open/Year              3,600       3,400       3,600       3,600
Ref. Quest./Year            49,672      40,091      60,160      31,727
Ref. Quest./Staff Hr.         3.10        8.01        3.76        3.20
Ref. Quest./Hr. Open          13.8        11.8        16.7         8.8
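The two rate lines in Table 1 are simple divisions of the raw figures above them and can be recomputed directly; minor differences from the printed table reflect rounding.

    # Recomputing the Table 1 ratios from the raw figures printed above.

    branches = {
        "Branch A": {"staff_hours": 16_000, "hours_open": 3_600, "questions": 49_672},
        "Branch B": {"staff_hours": 5_000,  "hours_open": 3_400, "questions": 40_091},
        "Branch C": {"staff_hours": 16_000, "hours_open": 3_600, "questions": 60_160},
        "Branch D": {"staff_hours": 10_000, "hours_open": 3_600, "questions": 31_727},
    }

    for name, b in branches.items():
        per_staff_hour = b["questions"] / b["staff_hours"]
        per_hour_open = b["questions"] / b["hours_open"]
        print(f"{name}: {per_staff_hour:.2f} questions per staff hour, "
              f"{per_hour_open:.1f} per hour open")

Branch B's roughly eight questions per staff hour, against three or so at the other branches, is the contrast the budget request rested on.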

In looking at the data in Table 1, several other
staff members have the subjective reaction that
we are either undercounting or using an invalid
sampling technique. A conversation among the
director, the Adult Services Coordinator, and
myself revealed that the sampling techniques had
been developed primarily to provide collection
development information for Adult Services. The
director and I, however, primarily use them for
measurement of work load and service delivery,
budget work, and future planning for staff size.
We agree that we need to do more testing and
refinement of our sampling instrument and will
be working on that in the upcoming year.

Allocation/Reallocation of Resources

The toughest part of any administrator's job
is the allocation or reallocation of resources.
Output measures assist in these decisions. Until
three years ago the Wake County Public Libraries
System only divided its materials budget by the
categories of adult, children's, continuations, and
periodicals. Branches purchased what they
needed. In FY 1989, at the request of branch
heads, the budget was subdivided into individual
branch budgets for adult materials. Since then
this has been done for children's materials. Be-
cause we believe that resources should flow to the
areas of highest use, the branch managers in the
first year advocated a strict appropriation of
monies based on circulation. With experience,
however, the appropriation has become less abso-
lute, as we also must acknowledge that there is a
floor below which a branch budget cannot fall
without totally crippling service. In my opinion, a
viable public library branch must have a minimum
materials budget of $15,000. So, we combine both
objective measures and subjective knowledge in
establishing branch budgets.
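One way to express that combination of floor and circulation-based allocation is sketched below. The branch names, circulation figures, and total budget are hypothetical, and the formula is an illustration rather than the system's actual method; only the $15,000 floor comes from the text.

    # Illustrative sketch: every branch gets the floor, and the remainder is
    # divided in proportion to each branch's share of system circulation.
    # All figures except the floor are hypothetical.

    FLOOR = 15_000
    total_budget = 200_000

    circulation = {"Branch A": 310_000, "Branch B": 120_000, "Branch C": 70_000}

    remainder = total_budget - FLOOR * len(circulation)
    total_circ = sum(circulation.values())

    allocations = {
        branch: FLOOR + remainder * circ / total_circ
        for branch, circ in circulation.items()
    }

    for branch, amount in allocations.items():
        print(f"{branch}: ${amount:,.0f}")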

Just as the manager of Branch B positioned
her library to receive additional personnel by
increasing the accuracy of its data collection, a
branch head can affect the amount of additional
monies allocated beyond its budget floor by pur-
chasing materials which will circulate well and by
keeping the collection weeded so that the turnover
rate will not be affected by dead wood. Wake
County has a tiered library system with no main
library. Regional branches located geographically
throughout the county in population centers pro-
vide additional resources for smaller popular
lending libraries. Therefore, smaller libraries
which have spent monies on books for which they
had one or two potential readers instead of
borrowing the title from a regional branch will not
have the same service return as the smaller library
which concentrates on purchasing popular read-
ing while borrowing more eclectic items from the
regional libraries. Appropriate selection can raise
the percentage of the gross circulation the branch
contributes to the system. Circulation goes up,
patrons receive more service; patrons receive
more service relative to other branches, the
branch receives more discretionary money for
books. This outcome reflects the effect that staff
can have in bringing more resources to their area
of responsibility. By increasing the level of service
delivered relative to the system, reallocation of
resources, in this case book monies, brings more
dollars to that service point. The book circulation
output measure can be used as one of the assess-
ment factors in evaluating the branch manager's
selection skills.

Conclusion

When an administrator uses statistical mea-
sures to make decisions, there will be unhappy
campers. For better or for worse, each manager or
staff member perceives his or her situation as
unique, outside the statistical parameters, and
having an incredible number of extenuating cir-
cumstances -- which he or she will repeatedly
share with you. In a profession where we give very
personalized, customized service to individuals, it
is difficult to accept that all those individualized
units of service do add up to produce bell curves,
means, medians, standard deviations, and chi
squares. It seems inhuman that it comes down to
that. Perhaps that is why we have this dichotomy
within ourselves that statistics apply to everyone
else, but "I need to explain why my situation is
different, so the statistics don't really count."

I advocate a team approach in developing
quantitative measurements for a library system.
It helps everyone understand that statistics are
more than numbers. If collected properly, they
can create a vivid picture of the effort a staff
makes in serving its community. They can be
persuasive means of securing additional financial
support. Together, with subjective observations,
they can assist us in making better decisions
about resource allocations. Staff will perceive
decisions made by managers as more rational and
more fair. As with Branch B, perhaps they will use
work performance measures to explain why "my
situation is different" and why "I do need the
additional resources requested."













Public Library Evaluation:
A Case Study

James J. Govern

To evaluate the relative strengths of public
libraries, it is crucial that individual library sys-
tems evaluate themselves. Libraries unwilling to
evaluate programs, services and personnel -- and
to make improvements based on those findings --
will eventually be forced to do so because of the
need for accountability and the struggle for scarce
public dollars.

This article describes two approaches that
public libraries can use to evaluate their programs
and services: (1) the Childers and Van House
multiple constituencies model and (2) traditional
output measures.

The Multiple Constituency Model of Evaluation

In an article in the October 1, 1989, issue of
Library Journal, Thomas Childers and Nancy A.
Van House list four approaches to evaluating
effectiveness within organizations: the goal model
as exemplified by output measures; the process
model based on internal organizational health;
the open systems model, which measures an
ability to attract resources; and the multiple con-
stituencies model, defined as the ability to meet
needs and expectations of certain groups.1 In
their study, Childers and Van House explore the
multiple constituencies model. The authors iden-
tify seven groups to which public libraries are
accountable: trustees, community leaders, library
administration, library staff, patrons, friends, and
government officials. They consider these groups
to be "influential, directly or indirectly, in organi-
zation-level decisions."2

James J. Govern is director of the Stanly County Public
Library in Albemarle.

The authors identify sixty-one key indicators
that typically describe what public libraries either
do or have. Their study shows that six of these
sixty-one indicators were reported in the top nine
responses for all constituent groups. Statistically,
the six most important indicators of library effec-
tiveness as revealed by this study are: staff helpful-
ness, services suited to the community, range of
materials, range of services, convenience of hours,
and materials quality. It is interesting that this
study discovered that size of library was not a
determinant in choosing what were viewed as
important "indicators of effectiveness." The seven
constituent groups of small, medium, and large
libraries all viewed the same indicators as being
important to a library's effectiveness.

In a recent survey of library constituent
groups (staff, trustees, county administration,
library administration, patrons and community
leaders), the Stanly County Public Library
attempted to apply the methodology of this
national study to discover the perceived level of
effectiveness of our public library within the six
areas. (The following is a copy of the survey form;
Table 1 highlights the results of those surveys.)

Stanly County Public Library
1990
[Trustee] Evaluation of Library Service Survey Form

Listed below are the 6 leading indicators of library service as reported in a recent national survey of public library trustees,
community leaders, library administration, staff, patrons, friends and government officials. The groups selected these indicators
from a list of sixty-one "things" that libraries typically do. Please rate the effectiveness of our library on these indicators.

                                        Very Effective                Effective
1. Staff Helpfulness
2. Services Suited to the Community
3. Range of Materials
4. Range of Services
5. Convenience of Hours
6. Materials Quality

Comments or Suggestions

TABLE 1.
Effectiveness of Library Services

                                Staff        Services Suited  Materials  Range of   Range of  Convenience
Constituent Groups:             Helpfulness  to Community     Quality    Materials  Services  of Hours     Averages
Patrons (n = 39)                4.80         4.50             4.50       4.20       4.50      4.40         4.48
Government Officials (n = 4)    4.50         4.50             4.00       4.00       4.00      4.25         4.21
Trustees (n = 6)                4.83         4.00             4.16       4.00       3.66      4.00         4.11
Staff (n = 18)                  4.55         4.27             3.94       4.00       3.77      3.38         3.98
Library Administration (n = 1)  4.00         4.00             4.00       4.00       4.00      3.00         3.83
Community Leaders (n = 17)      4.29         3.93             3.82       3.52       3.64      2.94         3.69
Group Averages                  4.50         4.20             4.07       3.95       3.93      3.66         4.05
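The averages in Table 1 appear to be unweighted means of the six indicator ratings (they can be reproduced, within a cent of rounding, from the figures shown), as the following sketch illustrates.

    # Recomputing the Table 1 averages from the ratings printed above.
    # Differences of plus or minus 0.01 are rounding.

    ratings = {
        "Patrons":                [4.80, 4.50, 4.50, 4.20, 4.50, 4.40],
        "Government Officials":   [4.50, 4.50, 4.00, 4.00, 4.00, 4.25],
        "Trustees":               [4.83, 4.00, 4.16, 4.00, 3.66, 4.00],
        "Staff":                  [4.55, 4.27, 3.94, 4.00, 3.77, 3.38],
        "Library Administration": [4.00, 4.00, 4.00, 4.00, 4.00, 3.00],
        "Community Leaders":      [4.29, 3.93, 3.82, 3.52, 3.64, 2.94],
    }

    # Average rating given by each constituent group
    for group, scores in ratings.items():
        print(f"{group}: {sum(scores) / len(scores):.2f}")

    # Average rating for each indicator across the six groups
    indicators = ["Staff Helpfulness", "Services Suited to Community",
                  "Materials Quality", "Range of Materials",
                  "Range of Services", "Convenience of Hours"]
    for i, name in enumerate(indicators):
        column = [scores[i] for scores in ratings.values()]
        print(f"{name}: {sum(column) / len(column):.2f}")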

Interpreting the Results of Constituency
Surveys

The results shown in Table 1 indicate that
library patrons gave the library the highest aver-
age effectiveness rating among all groups sur-
veyed. The lowest average effectiveness rating was
turned in by community leaders. Other constitu-
ent groupsT effectiveness ratings fell somewhere
between those two groups. The indicator given
the lowest effectiveness rating on any of the six
individual areas was "convenience of hours," as
perceived by those within the community leaders
group. The highest effectiveness rating was given
to "staff helpfulness," by the library's board of
trustees. On average, the lowest rated area among
all areas for all groups was "convenience of hours,"
and the highest effectiveness rating among all
groups for all areas was "staff helpfulness."

This survey has demonstrated in a concrete
way for me some things that, as library director, I
have assumed for some time: the library staff is by
and large viewed as being helpful, and public
service hours are not as convenient as they need
to be. What I did find surprising from these effec-
tiveness scores was that patrons gave the library
the highest rating and that community leaders
gave the library the lowest rating.

It should be stressed that the ratings are
based on individualsT perceptions of how the
library behaves or operates. This survey did not
request explanations of the responses from those
providing the ratings. We have been able to ascer-
tain various group impressions of library service
areas, yet there was no specific information pro-
vided to allow us to understand or analyze why
those individuals responded the way they did. I
liken this type of library evaluation to public
opinion polling with its strengths and weaknesses.
Further implementations of studies of this type
might be expanded to require comments as well
as the numeric rating for each of the six areas.
However, within these six areas, an understanding
of how your library is perceived in the community
is very useful information when considering goals,
objectives, and setting priorities for your library.

Output Measures

Another method that the small public library
can use to measure effectiveness is output mea-
sures as developed by the Public Library Associa-
tion.3 These evaluation tools were developed so
public libraries could measure the results or out-
comes as opposed to input of their services. This
method of evaluation allows libraries to compare
their performance over time, to compare them-
selves with similar libraries, and to monitor
progress on their missions and objectives. Another
benefit of output measures is the ability to
describe to outsiders and staff alike the libraryTs
performance in specific areas. Historically, public
libraries reported input such as budget dollars
per capita or book budget dollars per capita. Out-
put measures is a way to measure performance.

The Stanly County Public Library staff has
surveyed library users each fall for the past three
years to determine how we were doing. Our library
chose to study the following five measures: title fill
rate (proportion of the titles sought that were
found); author/subject fill rate (proportion of the
authors/subjects sought that were found); brows-
ing fill rate (proportion of the time that browsers
found something); reference completion rate
(proportion of reference questions that were com-
pleted the day of the request); and the document
delivery rate (the length of time that patrons
must wait for requested materials).





Table 2 shows the results of those surveying
periods within those areas. The margin of error is
based on the usable sample size.
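The margins of error reported in Table 2 behave like the familiar margin of error for a sampled proportion. The sketch below uses the standard normal approximation; it is offered as a plausible reading, not necessarily the exact method used, and the sample size shown is hypothetical.

    # A standard 95% normal-approximation margin of error for a proportion.
    # The sample size below is hypothetical.

    import math

    def margin_of_error(p, n, z=1.96):
        """95% margin of error for an observed proportion p from n usable responses."""
        return z * math.sqrt(p * (1 - p) / n)

    # e.g., a 71% title fill rate observed on 800 usable title searches
    print(f"{margin_of_error(0.71, 800):.1%}")   # roughly plus or minus 3%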

The reference completion and browsing fill
rates show no especially significant statistical
differences between the survey periods. The title
fill rate, however, has decreased from seventy-one
percent (or a range of sixty-eight percent to
seventy-four percent) in 1987 to sixty-four per-
cent (or a range of sixty percent to sixty-eight
percent) in the 1989 survey. Over the same time
the subject/author fill rate has increased from
seventy-four percent (or a range of seventy per-
cent to seventy-eight percent) to eighty-one per-
cent (or a range of seventy-seven percent to
eighty-five percent). The movement in these fill
rates occurred during a time of little change in
circulation per capita and collection turnover,
two factors which have the potential to affect
directly those measures. That is to say, the library
and its collection were essentially as busy in 1987
as in 1989.

What could be the possible explanation of the
counter movements in these two measures? A
collection evaluation study during fiscal year
1988-89 pinpointed several high-demand areas
within the adult nonfiction collection which,
though only a small part of that collection,
accounted for a disproportionate share of its use.
For example, the 610s accounted for nine percent
of the adult nonfiction circulation during the
collection evaluation period, yet that area makes
up only five percent of the adult nonfiction
collection. We made changes in the book budget
beginning with the 1989-90 fiscal year to target
those areas where demand and holdings were not
in line. This change is one possible explanation
for the increase in the subject/author fill rate.
Within those areas of high demand, patrons began
to see more of a selection.
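The comparison that drove those budget changes can be expressed very simply: each area's share of circulation against its share of the collection. In the sketch below only the 610s figures come from the text; the other rows and the flagging thresholds are hypothetical.

    # Illustrative sketch: compare each Dewey range's share of circulation with
    # its share of holdings and flag mismatches. Only the 610s row comes from
    # the article; the other rows and the 1.25/0.75 thresholds are hypothetical.

    classes = {
        # dewey range: (share of circulation, share of holdings)
        "610s": (0.09, 0.05),   # figures cited in the article
        "330s": (0.06, 0.04),   # hypothetical
        "820s": (0.02, 0.06),   # hypothetical
    }

    for dewey, (circ_share, holding_share) in classes.items():
        ratio = circ_share / holding_share
        if ratio > 1.25:
            note = "demand exceeds holdings: candidate for more budget"
        elif ratio < 0.75:
            note = "holdings exceed demand: candidate for weeding or less budget"
        else:
            note = "roughly in line"
        print(f"{dewey}: circulation {circ_share:.0%}, holdings {holding_share:.0%} ({note})")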

Analysis of the decrease in the title fill rate
over the three survey periods is more problematic.
Our title fill rate has decreased during a period of
improvement in the document delivery figures
and no significant increase in the level of reserve
activity. That is, patrons are waiting less time for
requested materials, yet the proportion of the
titles sought compared with titles found is de-
creasing. I do not have a plausible explanation for
this occurrence. One would think that, if a fill rate
(author/subject or title fill rate) showed a decline,
oneTs reserve requests might increase and docu-
ment delivery would slow down. This scenario has
not been our experience in Stanly County.

One of the most significant bits of information
gained from this round of surveying is to be found
in the document delivery rate; that is, how long
someone has to wait for requested materials
(reserves). The Stanly County Public Library has
demonstrated an improved ability to turn reserves
around more quickly. We delivered forty-eight
percent of requested titles within fifteen days of
the initial request date during 1989 as opposed to
only twenty-nine percent within fifteen days
during 1988's survey period.

Because of the poor showing on the document
delivery rate in prior years, the library made a
change in handling bestsellers and reserves. We
decided that we simply were not purchasing
enough copies of high-demand items. By buying
more copies, as well as adding non-reservable
copies of bestsellers, we were able to improve the
delivery rate in 1989 over the previous year. These
changes, in addition to a closer monitoring of our
reserve situation, enhanced our effectiveness in
this area. Reserve monitoring was accomplished
by having a staff member track the amount of
time materials were on reserve using the database
component of AppleWorks on an Apple IIe. Keep-
ing this file up-to-date gave us a handy way to
judge quickly the demand for specific titles, as
well as very immediately showing those items on
reserve for extended periods.

TABLE 2.
Stanly County Public Library
Output Measures Results:
A Comparison of 1987, 1988 & 1989 Surveys

                             1987         1988         1989
Browsers' Fill Rate          94% (± 2%)   95% (± 2%)   96% (± 2%)
Subject/Author Fill Rate     74% (± 4%)   77% (± 4%)   81% (± 4%)
Title Fill Rate              71% (± 3%)   69% (± 4%)   64% (± 4%)
Reference Completion Rate    85% (± 2%)   93% (± 2%)   90% (± 2%)
% of Requests Filled:
  within 7 days              N/A          18%          32%
  8 to 14 days               N/A          11%          16%
  15 to 30 days              N/A          24%          16%
  more than 30 days          N/A          45%          34%

We also analyzed the results of our output
measures by comparing (see Table 3) averages for
selected measures4 with the national averages for
libraries serving populations between twenty-five
thousand and fifty thousand5 as reported in 1988
and 1989 by the Public Library Association. The
table is a listing of those comparisons.

TABLE 3.
Comparison of Output Measures

                             SCPL Average   National Average
Browsers' Fill Rate          95%            91%
Reference Completion Rate    89%            87%
Subject/Author Fill Rate     77%            78%
Title Fill Rate              68%            71%
% of Requests Filled:
  in 7 days                  25%            27%
  in 30 days                 58%            71%

The figures in Table 3 illustrate the averages
for three years of surveying (1987, 1988 and 1989)
for the Stanly County Public Library in these
selected measures, compared with the averages
from two years of surveying (1987 and 1988)
within libraries serving populations between
twenty-five thousand and fifty thousand. It is
noteworthy that our results on the fill rate mea-
sures are nearly the same as those averages for
similar-sized libraries participating in PLA's re-
porting for 1987 and 1988. Typically these libraries
were busier than ours in the circulation per capita
and collection turnover areas. This fact demon-
strates for us that our efforts with collections,
services, procedures, and so forth, that affect
these output measures have been as successful as
the efforts of libraries serving populations of
similar size.
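For readers who want to trace the SCPL column of Table 3 back to Table 2, it appears to be a straight average of the survey-period results: three years for the fill and completion rates, the two available years for the request percentages, with the 30-day figure taken as the cumulative total filled within thirty days. The sketch below reproduces the printed values.

    # Deriving the SCPL averages in Table 3 from the Table 2 results.

    table_2 = {
        "Browsers' Fill Rate":        [94, 95, 96],
        "Reference Completion Rate":  [85, 93, 90],
        "Subject/Author Fill Rate":   [74, 77, 81],
        "Title Fill Rate":            [71, 69, 64],
        "% filled in 7 days":         [18, 32],                      # 1988 and 1989 only
        "% filled in 30 days":        [18 + 11 + 24, 32 + 16 + 16],  # cumulative within 30 days
    }

    for measure, values in table_2.items():
        print(f"{measure}: {sum(values) / len(values):.0f}%")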

The only marked difference is in the docu-
ment delivery rate where the Stanly County Public
Library did not have as high a fill rate as compar-
able libraries. On average, seventy-one percent of
the requested materials were filled within thirty
days in those libraries, whereas our fill rate within
that time frame was only fifty-eight percent. These
clues provided us with the information that led to
changes in our reserve procedures. It will be
interesting to see if those changes in procedure
will continue to raise that percentage.

Using output measures and surveying library
constituent groups to discover their perception of
library effectiveness are two constructive ways to
begin evaluation of services and programs. The
results of these activities have given the Stanly
County Public Library useful information in
making decisions ranging from the allocation of
the book budget to determination of the hours of
operation. Additionally, positive comments rela-
tive to staff helpfulness are always welcome. As
evaluation techniques become more refined and
easier to use, and as they become required by
funding authorities, they will become more and
more a part of the public library's normal planning
cycle. Public libraries willing to begin evaluating
now will be ahead of the game in years to come.

References

1. Thomas Childers and Nancy A. Van House, "The Grail of
Goodness: The Effective Public Library," Library Journal 114
(1 October 1989): 44-49.
2. Ibid.
3. Nancy A. Van House and others, eds., Output Measures for
Public Libraries: A Manual of Standardized Procedures, 2nd
ed. (American Library Association, 1987).
4. James J. Govern, "Output Measures Results: A Comparison
of 1987, 1988 & 1989 Surveys" (Albemarle, NC: Stanly County
Public Library).
5. Statistical Report '89 (Chicago: Public Library Data Service,
Public Library Association, 1989). (This is the published
statistical report upon which the averages for comparably
sized public libraries are based.)













Quantity is Not Necessarily Quality:
A Challenge to Librarians To Develop
Meaningful Standards of Performance

for Library Reference Services

Patsy J. Hansel

The idea of performance standards presumes
that there is a consensus in the library profession
about what good performance is. However, the
profession has been hesitant to evaluate reference
service qualitatively. We are constantly evaluating
reference service quantitatively, as if our reference
statistics really mean something, while in just
about any library you choose, the staff will freely
admit that their reference statistics are inaccu-
rate. And even if your library is one of those where
staff really do keep track of every question they
get, what does that mean?

Most administrators seem to adhere to the
simplistic view that the more questions you
answer, the better you're doing. This sort of
reasoning is rampant in all areas of library admin-
istration. We cling to the belief that quality is
quantity. This assumption certainly makes evalua-
tion simple: as long as the numbers are increasing,
the library must be doing a good job. This belief is
based on another premise of library administra-
tion -- that funders respond to simplistic notions
(the bigger the numbers, the better the library's
doing), and that it isn't worth the trouble to try to
explain more complicated rationales to them.
This is probably true, but it doesn't excuse library
administrators for basing their internal decisions
on such simplistic notions.

If a reference staff answers a large number of
questions, that could mean simply that the library
is so incomprehensibly organized that users can-
not find anything on their own. If such a library is
reorganized, the number of reference questions
answered could actually decrease, while the users
of the library receive better service, being able to
find things more quickly for themselves.

Patsy J. Hansel, former Assistant Director of the Cumberland
County Public Library and Information Center, is Director of
the Williamsburg Regional Library in Williamsburg, Virginia.

Perhaps a less sophisticated group of users
asks more questions in the library -- perhaps they
ask fewer, because they are intimidated. Perhaps
the libraries that get the most questions also get
the easiest ones. Fewer, more complicated ques-
tions can take more time than lots of easy ques-
tions. A staff can get a large number of questions
and direct people to sources rather than helping
them or teaching them through the process.
Another staff can get the same number of ques-
tions but take the time to go the extra mile and
really help the patron. And there's always the
reference librarian who isn't going to take less
than twenty minutes to answer any question,
regardless of how simple, and regardless of how
frustrated the patron and fellow staff (taking up
the slack) get in the process.

Reference service is too complex and too
important to be judged simply on the basis of how
many reference questions any group of people

answers. At some point, we have to deal with the
quality of that service. If we donTt know what

quality we have, we have no way to determine if
we're improving or getting worse, what kind of
training for the reference desk works, and whether
or not individual reference librarians are doing a
good job.

The first attempt to confront the question of
the accuracy of reference service in libraries was
reported in 1969.1 The researchers used "anony-
mous shoppers" to ask questions of reference
librarians in public libraries, and the results were
disappointing. The authors reported the following
shortcomings: "minimal interest in exactly what
inquirer is seeking, failure to recognize fairly well-
known titles, undue dependence on somewhat
outdated books rather than on current reports in
answering requests for recent information, a
concept of resources limited to the book or at
most the book and the magazine rather than to
the full range of communication media, and lack
of initiative on the part of staff in seeking material







from another source if the local library does not
have it."2

Twenty years later, there is no indication that
libraries are doing any better. In his 1984 article
on reference evaluation, Alvin Schrader concluded
that "unobtrusive procedures have not yet become
a component of the standard methods for evalu-
ating library and information service perform-
ance."3 He continued, "The problem of the lack of
commitment to reference service excellence will
neither go away nor be resolved by the kind of
passive approach which has so far characterized
our efforts. Researchers, educators, and practi-
tioners must, first and foremost, acknowledge the
existence of problems with respect to reference
service accuracy. This acknowledgment has not
yet occurred on a wide scale. Until it does, until
our community is prepared to take seriously the
call for reference service accuracy, unobtrusive
performance measurement will remain as the
next frontier for library and information services.
As of now, we are still in the age of misinforma-
tion."4 The editors of the collection in which
Schrader's article appears have an even more
succinct analysis of our current situation, "It is
suggested by the papers in this collection that one
reason librarians suffer the ignominy of low
salaries and even lower community respect is that
they do so badly at their work."5

In 1985, Terence Crowley, the originator of
"unobtrusive" questioning to determine reference
accuracy, summarized the research in his article,
"Half-Right Reference: Is it True?"6 He concluded
that although unobtrusive methodology had been
accepted by researchers, it had not yet become a
tool for evaluating reference service in the field,
and he expressed concern: "Until librarians deal
effectively as a profession with the many and
seemingly endless sources of error in reference
work, we will remain passive observers of popular
culture. Some of us will provide timely, appro-
priate, and consistently accurate information, but
the institution in which we work will not be
fulfilling its potential role in the information age."7

Many reference administrators continue to
object to the use of unobtrusive testing on ethical
grounds. What I personally find unethical is adver-
tising a service which is often of questionable
quality. However, the library profession has had
more than twenty years to adopt unobtrusive
testing as a method of evaluating reference service
and has not done so, and there is little likelihood
that it will become an accepted method of library
evaluation anytime soon. That is why I was so
interested several years ago to read of the
Wisconsin-Ohio Reference Evaluation Project
being developed by Charles A. Bunge from the
University of Wisconsin-Madison and Marjorie E.
Murfin from Ohio State University.8

At that time, the program had been used in a
number of academic libraries and was being
tailored for public libraries. I was working at the
Cumberland County Public Library & Information
Center (CCPL&IC), and decided to contact Dr.
Bunge to have CCPL&IC become part of his pro-
gram. What we called the "Bunge forms" were
used at CCPL&IC during 1988. The process is
simple. The library receives a set of two-part
forms, one part for the patron, a corresponding
part for the librarian. When a patron asks a
question, the librarian gives the patron one part
of the form, and the librarian makes a quick note
of the question on the second part of the form.
After the transaction is completed, the librarian
fills out the rest of the form, answering such
questions as how busy the library was when the
transaction occurred, how difficult the question
was, how many sources were consulted, and
whether the question was answered or not. The
patron part of the form includes demographic
information as well as questions about how busy
the librarian seemed to be, how difficult the ques-
tion was, and whether the patron's question was
answered. When all the forms are completed, they
are sent to Dr. Bunge and analyzed, and the
library receives a lengthy report detailing the
library's performance and comparing it to that of
other participating libraries. As a person who had
also been involved in unobtrusive testing of refer-
ence service at the same library, I found the
Bunge-Murfin program to have many of the same
benefits that unobtrusive testing has, without
being nearly as time-consuming or potentially
threatening to staff.

I did work the reference desk during some of
the time that the forms were being used in Cum-
berland, and I would like to share one experience
that illuminated for me how we often cannot
trust our own perceptions of whether or not we
are doing a good job at the reference desk. A
patron asked me a question about government
grants. I asked some follow-up questions and
decided that the reason that I could not get a






clear picture of what the patron wanted was
because the patron herself was not certain. I did
what I usually do in those situations: I gave her a
reference book to start with and asked her to
return for further help if she needed it. She did
not return for further help, but did return the
book, and at the time gave me her part of the
form. I asked her if she had found what she
needed and she said yes. After she left, I looked at
the form. In the part where she was to indicate if
her question had been answered, she had re-
sponded no. Some of us have long suspected that
the many positive evaluations that libraries re-
ceive from patrons are not entirely related to
reality. In this case, a patron was willing to be
honest on a form, even one that she was handing
directly to the person who had failed to help her,
at the same time that she was not willing to be
honest with that person face-to-face. That one
interaction was enough to convince me that using
the Bunge forms would give us information that
we were not getting with our self-evaluation
methods, such as the number of reference queries
answered within twenty-four hours.

A frequent challenge to tests of reference
accuracy is that they employ factual questions
that are not typical of those asked in libraries,
that the majority of questions asked in libraries
are more complicated, and that librarians do very
well in answering them. In the libraries that have
participated in the Wisconsin-Ohio Reference
Evaluation Program, eighty-five percent of the
questions asked were not strictly factual, and
librarians were "less adept" at answering these. In
Bunge's view, "that's to be expected, because the
'non-factual' questions are less definite, and the
opportunities for patron dissatisfaction are
greater."9

The other most exciting work being done in
the area of reference evaluation is that developed
by Ralph Gers and Lillie Seward when both were
at the Maryland State Library.10 The Maryland
program involved unobtrusive evaluation in public
libraries throughout the state to determine the
rate of reference accuracy. As part of the survey,
the researchers observed the behaviors that
librarians used during reference transactions, and
then determined the behaviors that were asso-
ciated with success in answering the questions.
Next they developed training sessions for refer-
ence librarians (administrators came, too) based
on what they deemed to be the most effective
behaviors that they saw used during the unobtru-
sive survey. Those who participated in the work-
shops were encouraged to return to their libraries
and train others. Following the workshops, the
libraries were unobtrusively surveyed again. The
results: libraries that had participated in the
training had better success rates than those that
had not.

Ralph Gers is now working independently,
and for a fee, any library or group of libraries can
contract with him for the workshops, the unobtru-
sive testing, or both. Although the cost for the
package is high by library standards, the training
is intensive and often very productive. Gers
reports that he has just had his first one hundred
percent library -- after the training, this library
answered every question correctly in the follow-up
unobtrusive survey.13

While we can use the two methods mentioned
above to evaluate reference service in our libraries
and develop training to improve that service, the
performance standards that we develop must
also consider the environment in which reference
service occurs. Library administrators must admit
that their reference staffs are frequently asked to
be far more than reference librarians. Perhaps the
most difficult situations arise in those libraries
where there is no separate security staff, so that a
reference librarian is required one moment to be
courteous and helpful with a reference patron,
and the next moment must become The Enforcer,
instructing a disruptive patron about the conse-
quences of continued unacceptable behavior in
the library. Add that to the fact that administra-
tors frequently ask their staffs to do too much, to
work too many hours at a public service point,
and we may have a formula for failure.

A recent article in RQ refers to the extremely
low morale that has been observed in many library
reference departments.12 The article begins with a
summary of the research that has shown a corre-
lation between the morale of workers in various
jobs and their performance. The article then
details a study by Ralph Lowenthal using various
instruments to survey four public library reference
staffs to determine the level of their job satisfac-
tion. Following that survey, he used the Wisconsin-
Ohio Reference Evaluation Program to determine





the rates of reference success in those libraries.
Not surprisingly, measures of job satisfaction such
as perceived tension, stress and strain, emotional
exhaustion, and disaffection from patrons were
correlated with lower levels of reference perform-
ance. Conclusion: if a reference librarian is unplea-
sant for other staff to be around, that person
probably isnTt giving very good reference service,
either. If an entire department is suffering from
stress and strain, reference service in that library
is probably suffering. Our performance standards
must address the question of what volume of
work a reference librarian can reasonably be
expected to perform, both on a public service
desk and off.

Conclusion

Libraries have been reluctant to evaluate
reference services qualitatively. Perhaps this is
partially because such evaluation is difficult.
Numbers, although they may be suspect when we
examine them closely, are usually fairly easy to
acquire. Perhaps it is also because by evaluating a
service as "professional" as reference service, we
are risking the discovery that our service, and
therefore our profession, isn't always everything
we'd like to think that it is.

Until we are willing to evaluate reference
service in our libraries, we can have no empirical
basis for determining what level of reference
service we are giving. We will continue to have
only our mushy assumptions as a profession about
what standards of performance we should expect
from our staffs and from our libraries.

As a profession, we should no longer be
content to assume that our libraries are giving
good service. We must take the responsibility for
giving the good service that we persist in telling
the public that we are offering. To do that, we
must first determine what level of service they are
receiving. Then we must do all that we can to
maintain excellence when we have it, and to work
toward better service when we find that service
lacking.

References
1. Lowell A. Martin, assisted by Terence Crowley and Thomas
Shaughnessy, Library Response to Urban Change (Chicago:
American Library Association, 1969).

2. Ibid., 28.

3. Alvin M. Schrader, "Performance Standards for Accuracy in
Reference and Information Services: The Impact of Unobtrusive
Methodology," in Bill Katz and Ruth A. Fraley, eds., Evaluation
of Reference Service (New York: The Haworth Press, 1984): 208.
4. Ibid., 210.
5. Katz and Fraley, 4.
6. Terence Crowley, "Half-Right Reference: Is it True?" RQ (Fall
1985): 59-68.
7. Ibid., 67.
8. Charles A. Bunge and Marjorie E. Murfin, "Reference Ques-
tions -- Data from the Field," RQ (Fall 1987): 15-18.
9. Charles A. Bunge, letter to the author dated May 4, 1990.
10. Ralph Gers and Lillie Seward, "Improving Reference Per-
formance: Results of a Statewide Study," Library Journal
(November 1, 1985): 31-35.
11. For one professor's objection to the Maryland methodology,
see "Data abuse in reference report," letter to the editor from
Thomas Childers, LJ (April 15, 1986): 10.
12. Ralph A. Lowenthal, "Preliminary Indications of the Rela-
tionship between Reference Morale and Performance," RQ
(Spring 1990): 380-393.
13. Telephone conversation between the author and Ralph
Gers, May 10, 1990.













Performance Measures in
Youth Services

Rebecca Sue Taylor

How many times have you been told by colleagues, library school professors, supervisors, and journal articles that you need to talk like, think like, dress like, and act like an administrator in order to get your department's fair share of funds, power, and respect?

How many times have you thought, "I've got more important things to do: plan the storytelling festival; get out the Toddler Time publicity; meet with the school librarians. I don't have time to play administration games."

Is there any reasonable way to balance your real work with what you must do to justify that work and ensure its continued funding and support? Here are a few suggestions to bring your life back into balance.

Step One

The first item of business is to stop and make an attitude adjustment. Taking time for study, research, reading, planning, and just plain thinking is a legitimate use of your time. You are a manager or supervisor because someone values your knowledge and experience. Taking time to read the current professional literature, attend professional meetings, and talk to colleagues is as much a part of your job as making sure there are enough reading records to last all summer.

A good place to start your reading is Barbara T. Rollock's Public Library Services for Children,1 published in 1988. It is probably the most current overview of the functions and methods ascribed to "children's services" since Dorothy Broderick's Library Work With Children was published in 1965.2 It is interesting that even in the 1977 revision of Broderick's work (the "bible" for many of us now in the management levels of children's services) there is absolutely no mention of management, the planning process, output measures, or evaluation processes.

Rollock's work, on the other hand, focuses considerable attention on the management concerns of a children's librarian. She asserts that the major responsibility of a manager of public library children's services is to keep in touch with the ideas, concerns, and planning taking place in the entire field of librarianship, not just within youth services.3

Rebecca Sue Taylor is Coordinator of Youth Services for the New Hanover County Public Library, Wilmington, N.C.

... the major responsibility of a manager of public library children's services is to keep in touch with the ideas, concerns, and planning taking place in the entire field of librarianship ...

Rollock covers national and state-wide standards as well as the development of performance measures as they apply to children's and youth services. She also presents a succinct and readable chapter on internal and operational management concerns. "Resting too comfortably, perhaps, on a tradition of success, practitioners of services to children have failed to offer objective proof of their techniques for measurement and evaluation."4 Rollock discusses funding, staffing, setting goals and objectives, and public relations, and concludes with the assertion that children's librarians need to develop goals and objectives, train staff to meet these written standards, and evaluate carefully departmental services in terms of the successful completion of these written goals and objectives. When one has carefully followed these steps, one is in a far more effective position to demand an appropriate budget and to spend it effectively.5

Step Two

Next you need to take some time to familiarize yourself with the language and processes that your administration is using. Take time to ferret out your library's copy of A Planning Process For Public Libraries.6 Published in 1980, this work replaced the national standards by which public libraries measure and judge their services. It is fairly technical and at times difficult reading, but even if you haven't already been through some part of the "planning process," you will eventually have to understand such terms as "data collection," "community survey," and even the ubiquitous "preschool door to learning."

Once you have at least a general understanding of what the planning process is and how it may involve children's services, take a look at the two manuals that were put out to enhance and supplement the original process. Planning and Role Setting for Public Libraries7 provides the tools to begin an actual planning process as well as numerous sample forms to be adapted for local use. The chapter on writing goals and objectives is particularly good, presenting clear and practical methods for creating a framework upon which to hang future methods of evaluation, while accepting the fact that not all libraries or library systems will choose to expend the same amount of time and staff resources on the process.

Even more important is Output Measures for Public Libraries.8 The chapter on data collection is excellent and should give you numerous ideas for the types of surveys that might be done within a children's services department. Chapter 4, "Interpreting and Using the Results," looks succinctly and rationally at what you may want to do with the statistics you have collected. Numerous types of forms for data collection are appended.

If the library does not have a
capacity for self-criticism and
change, an evaluation may
only be an exercise in futility.

Step Three

Now that you are familiar with the reasons for internal measurement and the planning process, its structure, and terminology, you need to take some time to explore the types of things that you may be able to measure effectively. In the excellent and thought-provoking article "Research and Measurement in Library Service to Children,"9 Adele M. Fasick asks, "Why have librarians engaged in serving children been put on the defensive about the way in which they evaluate their services, and what can be done to bring children's services back into the mainstream of library thinking?" Her article discusses the problems involved in using conventional quantitative measurement techniques to measure the types of services provided by a children's services department:

Although some of the reaction against quantitative measurement of children's services may be overly emotional, it is not true, as one of my colleagues once suggested, that "children's librarians are people who love children and hate statistics." There are some good reasons for protesting against the imposition of quantitative standards on children's work. There are problems in the evaluation of library services to children that simply do not exist in other types of library work.10

Adele Fasick also poses a number of youth-service-specific research questions which need to be measured and evaluated in ways different from the usual systemwide comparative measurements.

Probably the most important single article on measurement and evaluation of children's services is Mary K. Chelton's "Evaluation of Children's Services."11 After an excellent review of the history of prescriptive standards and the development of the planning process, she spends considerable time discussing just what evaluation is and is not. Among her heartening and realistic assertions are:

1. Evaluation is not the way by which one's ultimate worth is measured.

2. Evaluation is not always complicated.

3. Evaluation will not always prove what you want it to.

4. Evaluation is not always quantitative (i.e., counting things) even when the results are presented and analyzed numerically.

5. Evaluation does not solve problems; it only provides the evidence needed to solve problems.12

Chelton continues with a detailed summary of specific types of evaluations; appended are sample instruments. The article concludes with the astute and to-the-point statement that "The fact that a program has clear measurable objectives, valid measures, and sufficient resources to document itself does not ensure a successful evaluation although all those factors must be present in order to do one.... If the library does not have a capacity for self-criticism and change, an evaluation may only be an exercise in futility."13

Two other articles that are worth finding and studying point out some areas for evaluation that are specific to children's services. Diana Young's "Evaluating Children's Services"14 presents a pertinent survey of questions every youth services administrator should ask and includes questions on facilities, materials, programming, and services.

Lesley S. J. Farmer's "Using Research to Improve Library Services"15 points up a possible avenue of additional research and reading in discussing the Dallas Public Library's survey of the effectiveness of preschool story hour delivery systems. Some public libraries have used current research in the field of child development to design programs that require the active involvement of parents.16 The operative phrase here is "current research in the field of child development." Certainly it is an avenue more public library children's librarians need to pursue.

Is there any reasonable way to balance your real work with what you must do to justify that work and ensure its continued funding and support?

Step Four

If you are convinced by now that you need to be doing some type of evaluation or statistical measurement of your department's services, you may want to track down Output Measures for Children's Services in Wisconsin Public Libraries by Douglas Zweizig and others.17 This report on the methods and results of a survey of children's services in Wisconsin's public libraries was not widely disseminated. It is well worth the time and effort it may take to obtain it through interlibrary loan from the Wisconsin State Library. The report includes several excellent measurement instruments that can be adapted for local use as well as detailed instructions on specific methods of data collection.

Items such as the "Census Work Form"18 and the "In-Library Materials Use Log"19 are well designed, and the report's data summaries will give you some basis for comparison. One hopes that, in the near future, other states and library systems will replicate the survey, at least in part, and will begin to publish the results so that additional comparisons can be made.

Step Five

Now it's time to customize and tailor all your reading, research, and thinking to your library's specific needs. Make a list of the things you want your department to accomplish (your goals). Use your library's overall mission statement and goals, but take time to think through the specific aims of your special service area as well. Are the activities and projects that your staff spends the most time on clearly reflected in a position of priority in your goals? Are they the things you want your department to be committed to accomplishing? Can you do them well?

Once you have a list of goals (or statements of the services you want to provide), take a look at the steps you must accomplish. As you list the steps (your objectives), think about how you can measure whether you have successfully achieved each step.

Step Six

Finally, it's time to accumulate some statistics and write some reports. Surprisingly, this may be the easiest part of the process. You may want to replicate parts of the Wisconsin Output Measures for Children's Services project, or you may want to use methods presented in Output Measures for Public Libraries. Have other systems in your area done surveys that you can adapt or replicate? Is it possible to adapt an instrument or process already in use in another department of your library, such as a reference question fill rate survey, to your needs?

Remember that your instruments don't have
to be complicated and exotic to measure some-
thing. You may already have access to some of the
numbers you need from monthly or annual re-
ports of activities, such as program attendance
and circulation statistics.

Do you have access to statistics specific to children's services already accumulated in reports and evaluations you have done in the past? A variety of measurements might be made from information routinely kept on summer reading club membership and reading accomplishments. An analysis of staff time-use patterns might be made from existing evaluations of programs that include a breakdown of the hours spent on planning, publicity, performance, and evaluation.

Don't forget to tailor your measurements to a children's services perspective. For instance, does measuring questions asked (i.e., reference and directional questions) actually show the number of personal contacts your staff had with patrons each day? Would it be better to measure "contacts" that include such interactions as helping a preschooler put together a puzzle or explaining summer reading club rules to a second grader?
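As a purely illustrative sketch of the point, the short Python fragment below turns a day's hypothetical tallies into a total "contacts per day" count and a rough attendees-per-staff-hour figure for programs. Every category name and number is invented, not drawn from any library's data.

# Hypothetical children's-services tallies; all names and numbers are invented.
daily_contacts = {
    "reference questions": 14,
    "directional questions": 22,
    "readers' advisory": 9,
    "program help (puzzles, reading club rules, etc.)": 31,
}

program_log = [
    # (program, attendance, staff hours for planning, publicity, performance, evaluation)
    ("Toddler Time", 25, 6.5),
    ("Storytelling Festival", 140, 28.0),
    ("Summer Reading Club sign-up", 60, 4.0),
]

print("Total patron contacts today:", sum(daily_contacts.values()))
for program, attendance, staff_hours in program_log:
    print(f"{program}: {attendance / staff_hours:.1f} attendees per staff hour")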

Finally, control your evaluation instruments. Don't let them control you. If you wait to be told to do a fill rate survey, you will also probably have to use the form devised by your administrative team for use throughout the system. If you have already designed and made an evaluation that is meaningful to your special service population, you will have done a better job of representing them and their needs to your administration.

Make the time to explain to your supervisor, director, or administrative team what you are doing, why you are doing it, and how successful you are in doing it. Use numbers, charts, and graphs, but don't forget to include a narrative that compares and summarizes your results. Put your results and conclusions into written reports so that they become part of the official body of information used to make future planning decisions. In the end, the thoroughness with which you document your program's activities, needs, and successes will have a direct effect on the resources you will have to devote to story times, reading clubs, storytelling festivals, and the rest of your "real" work.

References
1. Barbara T. Rollock, Public Library Services for Children (Hamden, Conn.: Library Professional Publications, 1988).
2. Dorothy M. Broderick, Library Work With Children (New York: H. W. Wilson, 1977).
3. Rollock, 27.
4. Ibid., 46.
5. Ibid., 70.
6. Vernon E. Palmour, Marcia C. Bellassai, and Nancy V. De Wath, A Planning Process For Public Libraries (Chicago: American Library Association, 1980).
7. Charles R. McClure, et al., Planning and Role Setting For Public Libraries, 2nd ed. (Chicago: American Library Association, 1987).
8. Nancy A. Van House, et al., Output Measures For Public Libraries, 2nd ed. (Chicago: American Library Association, 1987).
9. Adele M. Fasick, "Research and Measurement in Library Services to Children," Top of the News 35 (Summer 1979): 354-62.
10. Ibid., 355.
11. Mary K. Chelton, "Evaluation of Children's Services," Library Trends 35 (Winter 1987): 463-84.
12. Ibid., 465-67.
13. Ibid., 474-75.
14. Diana Young, "Evaluating Children's Services," Public Libraries 23 (Spring 1984): 20-22.
15. Lesley S. J. Farmer, "Using Research to Improve Library Services," Public Libraries 26 (Fall 1987): 130-31.
16. Ibid., 131.
17. Douglas L. Zweizig, Joan A. Braune, and Gloria A. Waity, Output Measures For Children's Services In Wisconsin Public Libraries: A Pilot Project - 1984-1985 (Madison: Wisconsin Division for Library Services, 1985).
18. Ibid., 13.
19. Ibid., 18.








Performance Measures and
Technical Services:
Efficiency and Effectiveness

Karen S. Croneis and Linda H. Y. Wang

Librarians are currently using performance measures to evaluate services. Because libraries are complex organizations with many interrelated departments, any evaluation of public services activities is also an implicit evaluation of technical services policies and procedures. Therefore, it is important for technical services librarians to be familiar with the concepts of performance measurement.

Efficiency and Effectiveness

Historically, the two components of evaluation have been efficiency (doing things right) and effectiveness (doing the right things). These two exist in an inverse relationship; that is, increased efficiency generally results in decreased effectiveness. For example, providing fewer access points may speed up cataloging, but it also tends to decrease the user's chances of finding information.

F. W. Lancaster1 has declared that, theoretically at least, technical services can be evaluated from two viewpoints. The first deals with internal efficiency.

As production units, technical services have valued efficiency and based their evaluations on that fact. Cost and productivity have been the primary considerations in evaluating internal efficiency. Technical services librarians have well documented their success as efficiency experts. There are many studies on a wide variety of technical services activities: time studies, cost-benefit analyses, vendor and systems evaluations, and others.

Lancaster also posits evaluation based on effectiveness, that is, on the long-range effect that technical services have on the public services of the library. The title of an excellent book, Cost Effective Technical Services, provides an example of how easily the two concepts of efficiency and effectiveness can be confused. Papers and case studies examined cost-efficiency (doing things more inexpensively) without a corresponding discussion of effectiveness. The question "Is this the right thing to do?" was generally not addressed.

Karen S. Croneis is head of the Physics-Mathematics-Astronomy Library at the University of Texas at Austin. Linda H. Y. Wang is Reference Librarian at the University of South Alabama, Mobile.

Few studies examine the effectiveness of technical services, that is, their impact on public services. Granted, the number of studies on information-seeking behavior, book availability, and document delivery has increased significantly in the last fifteen years. These have not been as widely discussed in the traditional technical services literature because of the technical/public split.

Activities involving users have been seen as strictly "public services" issues and, therefore, "not technical services" issues. Likewise, "technical" equals "not public." Again, effectiveness has been the domain (and the problem) of public services while technical services have been concerned with efficiency.

This efficiency/effectiveness discussion has also been phrased in terms of quantity/quality. In discussing the trade-offs inherent in technical services, Carol Mandel concludes, "A formal and quantitative approach to analyzing questions of quality and productivity in technical services will result in a net benefit to library users."2

User Groups

To be valid indicators of library effectiveness, performance measures must incorporate user data.3 Recognizing that users are individuals whose information demands may not match their information needs, the first step in using performance measures is to identify broad-based user groups.

When the term "user group" is mentioned, most technical services librarians think of a vendor- or system-based user group (e.g., NOTIS User Group, Music OCLC User Group). In the context of performance measures, however, the term "user group" refers to the people who "use" the results of the complex set of activities called technical services (acquisitions, cataloging, serials control, physical processing, binding, and preservation).

Technical services librarians have at least three user groups: the general public, public services librarians, and network users.

The first group consists of the "public," the well-defined group (or groups) of people who use a particular library for a particular reason. Most libraries create promotional and informational pieces that list various services targeted to specific market groups.

The second group of users that technical services librarians serve is the other staff members, primarily the public services librarians, at their own institutions. While the information needs of these two groups are different, they have the same basic demands: timely receipt and processing of materials, and easy access to the information contained therein. In this situation, timeliness can be seen as a measure of efficiency; access, of effectiveness.

Network members constitute a third user group. These people use the data records that technical services people have created and contributed to a cooperative database. The records might be used for shared cataloging, acquisitions, interlibrary loan, etc. Technical services librarians can easily identify the needs of this user group, primarily because they are also members of it.

... any evaluation of public services activities is also an implicit evaluation of technical services policies and procedures.

OPAC as Common Ground

Developing performance measures for these groups is challenging. Within each group are subgroups. Because individuals have many different information needs, they can belong to more than one subgroup. But, in an online environment, nearly all groups will use the online public access catalog (OPAC) to meet their needs. The automated catalog, "the keystone that joins the two areas of technical and public services,"4 is changing the way people use libraries. New user behaviors and expectations, in turn, are forcing libraries to reevaluate their operations.


In 1985 Barbara Markuson5 noted that most of our efforts have been devoted to automating the library and the functions of librarians, not to automating access and retrieval systems for our users. OPACs, like card catalogs, are windows on collections and gateways to information. As such, an OPAC must be evaluated in terms of the information it contains, how easy it is to use, and how effective it is.

Catalog Use

Research on catalogs generally falls into two areas: catalog use studies and catalog user studies. In their informative 1983 review article on catalog use studies, Pauline Cochrane and Karen Markey6 categorized the questions regarding online catalogs and then identified successful methodologies for studying each category. In doing so, they provided a framework that continues to serve us well.

Users expect, quite justifiably, that an OPAC will provide at least as much information or access as the card catalog. Gunnar Knutson7 compared an online catalog with an existing card catalog to detect levels and types of errors on four access points: names, titles, series, and subjects. The online catalog had a lower failure rate in all areas except series. Knutson checked 200 bibliographic records with 905 online access points and found 23 errors, an overall error rate of 2.54 percent. Using Knutson's figures to extrapolate for 500,000 records, one would expect to find 2,262,500 online access points and about 57,500 errors, a raw number that most librarians and users would find absolutely unacceptable.

Consistency studies (subject cataloging and classification) and availability studies can give useful performance measures. In a recent study based on Paul Kantor's availability analysis, Deborah Barreau8 identified four catalog problems that are most likely to interfere with patron success with the OPAC: (1) incomplete location information on the bibliographic record; (2) incomplete holdings information in the database; (3) special characters and punctuation in the index fields that were interpreted incorrectly by the search program; and (4) inadequate access points and display of fields in the default format. The terms "incomplete," "incorrect," and "inadequate" underline the fact that quality control is essential. Performance measures can be useful in quality control situations.

Catalog Users

Charles Hildreth9 has pointed out that "the online catalog stands apart from earlier catalogs because it is interactive, infinitely expandable, and public." There is no question about the potential of the online catalog as a tool for rapid, convenient, and comprehensive research.

Unfortunately, there is little evidence that searching an OPAC (as opposed to a card catalog) increases a user's success in finding information. Cochrane and Markey10 also concluded that "how the user and system interact is the important thing, not that the interaction occurs 'online'."

Studies of catalog users have focused on information-seeking behaviors, but no conceptual model of user behavior has been developed. Until then, data will remain only indicative and situation-specific.

Patrons use online catalogs differently from the way they use card catalogs, particularly for subject searching. In a recent study, Micheline Hancock11 found that users adapt their search to the structure of the tools available. A major obstacle to effective subject searching may lie in the lack of interaction among the indexing language, the classification scheme, and the actual titles.

Simply put, query terms generated by users do not match catalog subject entries. The likelihood that any two people will use the same term for a concept or a book, or that a searcher and an information system will use the same term for a concept, ranges from ten to twenty percent.12

Understanding the information-seeking behavior of users is crucial in designing an online catalog that complements the search strategies of its users. Because OPACs are "public," search strategies can be recorded on transaction logs, then examined and analyzed to determine what it is that users actually do in the search process. Success can be noted and problems identified. In one recent project, Thomas Peters13 found failure rates (defined as those searches that produced zero hits) of approximately forty percent for all types of searches.
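A zero-hit failure rate of the kind Peters reports can be computed directly from a transaction log. The Python sketch below assumes a hypothetical tab-delimited log in which each line records a search type and the number of hits returned; the file name and format are illustrative, not those of any particular OPAC or of Peters's study.

import csv

def zero_hit_rate(log_path):
    """Return the share of logged searches that produced zero hits."""
    total = failures = 0
    with open(log_path, newline="") as log:
        for search_type, hits in csv.reader(log, delimiter="\t"):
            total += 1
            if int(hits) == 0:
                failures += 1
    return failures / total if total else 0.0

# Example use, with a hypothetical log file:
# print(f"{zero_hit_rate('opac_transactions.log'):.0%} of searches found nothing")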

From such studies, librarians can establish baseline data and then compare those numbers with future performance measurements. Peters suggests that librarians use the information to develop bibliographic instruction programs and design OPAC teaching sessions that address the most prevalent problems. Bibliographers would be interested in summaries of the types and subjects of materials sought by OPAC users, both items in the database (for possible duplication) and items not in the database (for addition to the collection).

Marcia Bates14 urges librarians to rethink subject cataloging in an online environment. Access should be determined by the total mix of pre-existing and added "search capability" indexing. This "superthesaurus" would be designed and geared to the needs of users rather than indexers. A very active area of current research is the examination of expanded subject headings based on systems such as the Library of Congress, Dewey Decimal Classification, and PRECIS.

Clearly, today's technical services librarians are, or will become, database managers and providers of value-added services. In the future, "information resources will be almost seamlessly interfaced so that the public has direct, timely, and effective access to what it needs to know."15

It is the librarians' job to state in quantitative, measurable terms what "direct, timely and effective access" is. "Direct" might translate into finding an item in the owning library ninety percent of the time. "Timely" would depend on circulation status; ninety-eight percent of the time, a user would have an item from interlibrary loan within ten days. "Effective" might mean that no more than ten percent of the time a user gets zero hits on a subject search on the library's OPAC. Unreasonable? Maybe. But identifying the current levels of service, setting goals, and monitoring progress is what performance measures can do.
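A sketch of how such targets might be checked appears below. The targets restate the illustrative figures in the preceding paragraph, while the "measured" values are invented placeholders that a library would replace with its own survey results.

# Targets restate the illustrative figures above; measured values are invented.
targets = {
    "direct: item found in the owning library": 0.90,      # at least 90 percent
    "timely: interlibrary loan within ten days": 0.98,     # at least 98 percent
    "effective: subject searches with zero hits": 0.10,    # at most 10 percent
}
measured = {
    "direct: item found in the owning library": 0.83,
    "timely: interlibrary loan within ten days": 0.95,
    "effective: subject searches with zero hits": 0.27,
}

for name, target in targets.items():
    value = measured[name]
    met = value <= target if "zero hits" in name else value >= target
    print(f"{name}: measured {value:.0%}, target {target:.0%}, {'met' if met else 'not met'}")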

Some performance measures already exist
and others are needed. Ideally, these can be
developed by teams of technical and public service
librarians who bring their own perspectives and
expertise to an evaluation project.

Conclusion

Technical services' discomfort with effectiveness mirrors public services' uneasiness with efficiency. As OPACs are transforming the way people use libraries, they are also changing the relationship between technical and public services.

In 1986 Gillian McCombs16 suggested that, in ten years (that is, by 1996), an evaluation of library services would become "mainly an evaluation of the information provided on the VDT screen - how much of it is there, how easy it is to obtain and how quickly." Perhaps her prediction is more accurate than some would like to admit.

Our shared mission as technical and public
services librarians is to provide access to informa-
tion. Using performance measures and other eval-
uation techniques, we can find ways to serve our
users more efficiently and more effectively.






References
1. F. W. Lancaster, The Measurement and Evaluation of Library Services (Washington, D.C.: Information Resources Press, 1977), 264.
2. Carol A. Mandel, "Trade-offs: Quantifying Quality in Library Technical Services," Journal of Academic Librarianship 14 (September 1988): 220.
3. Ronald R. Powell, The Relationship of Library User Studies to Performance Measures: A Review of the Literature, University of Illinois Graduate School of Library and Information Science Occasional Paper, Number 181 (Urbana-Champaign, Ill.: University of Illinois, January 1988), 22-23.
4. Caroline Arms, "The Technological Context," in Campus Strategies for Libraries and Electronic Information, ed. Caroline Arms, EDUCOM Strategies Series on Information Technology (Bedford, Mass.: Digital Press, 1990), 13.
5. Barbara Evans Markuson, "Issues in National Library Network Development: An Overview," in Key Issues in the Networking Field Today, Proceedings of the Library of Congress Network Advisory Committee Meeting, May 6-8, 1985, Network Planning Paper No. 12 (Washington, D.C.: Library of Congress, 1985), 9-32.
6. Pauline A. Cochrane and Karen Markey, "Catalog Use Studies - Since the Introduction of Online Interactive Catalogs: Impact on Design for Subject Access," Library and Information Science Research 5 (1983): 337-363.
7. Gunnar Knutson, "A Comparison of Online and Card Catalog Accuracy," Library Resources and Technical Services 34 (January 1990): 24-35.
8. Deborah K. Barreau, "Using Performance Measures to Implement an Online Catalog," Library Resources and Technical Services 32 (October 1988): 312-322.
9. Charles R. Hildreth, "Beyond Boolean: Designing the Next Generation of Online Catalogs," Library Trends 35 (Spring 1987): 647-667.
10. Cochrane and Markey, "Catalog Use Studies," 361.
11. Micheline Hancock, "Subject Searching Behavior at the Library Catalogue and at the Shelves: Implications for Online Interactive Catalogs," Journal of Documentation 43 (December 1987): 303-321.
12. Marcia J. Bates, "Rethinking Subject Cataloging in the Online Environment," Library Resources & Technical Services 33 (October 1989): 400-412.
13. Thomas A. Peters, "When Smart People Fail: An Analysis of the Transaction Log of an Online Public Access Catalog," Journal of Academic Librarianship 15 (November 1989): 267-273.
14. Bates, "Rethinking Subject Cataloging," 400-412.
15. James W. Dwyer, "The Evolutionary Role of Technical Services," Journal of Library Administration 9 (1988): 13-26.
16. Gillian McCombs, "Public and Technical Services: Disappearing Barriers," Wilson Library Bulletin 61 (November 1986): 25-28.








Performance Measures for
Online Systems

John Ulmschneider and Patrick Mullin

System performance: an overview

Librarians assess a library automation system by many parameters, such as the richness of its functionality, the ease of use of its interface, and its overall purchase and operating cost. One of the most important criteria is a system's performance. It is not uncommon for librarians to praise or condemn a system based on performance alone. But what do library managers mean when they speak of "system performance"? The "performance" of a computer application system can mean different things to different observers.1,2 At one extreme, many librarians treat the functionality of the applications software as the main criterion of performance: what does the application software actually do, and how well or thoroughly does it do those things? At the other extreme, the computer industry has developed a number of performance measures for computer systems that distinguish the computing hardware's capabilities from the way application software uses those capabilities. Hardware evaluations center on such parameters as central processing unit (CPU) speed, data retrieval and transfer speed from disks, memory architecture, and the like. Software evaluations assess many aspects of the application's operations to build a final picture of its performance: the application's use of processing resources, disk storage and retrieval demands, instruction mix, response time, memory requirements, and other parameters (Figure 1).

FIGURE 1.
Hardware evaluation compared with software evaluation

Hardware evaluation
CPU speed in million instructions per second (MIPS)
Memory speed, caching
Disk seek and read time
Number and speed of communications channels

Software evaluation
Response time to interactive users
Subroutine speed for boolean combinations
Disk storage demands for data and work space
Number of concurrent users or terminals supported

John E. Ulmschneider is the Assistant Director for Library Systems for the North Carolina State University Libraries. Patrick J. Mullin is Systems Librarian at the University of North Carolina at Chapel Hill and Interim Director of the Triangle Research Libraries Network.

Performance measures for evaluating library automation systems include something from both ends of the spectrum. In general, library managers are not concerned with the capabilities of the hardware platform used for a system; they are concerned only with the way the application software performs for the user. Librarians also sharply distinguish responsiveness for interactive operations, where users query the application system in real time, from batch operations, where a series of programs is executed automatically by the computer. The performance of a library application system is assessed through three parameters concerned primarily with interactive operations:



Response time: how quickly a computer system delivers a response to a user query in an interactive environment;

Application efficiency: what computing resources (processor cycles, memory, disk space) are required by the software to deliver an adequate response time; and

Capacity: the volume or amount of work a system can perform with a given amount of hardware resources, for instance, the number of concurrent searches it can perform.

The relationship between these performance parameters is not straightforward. For instance, suppose an application system is very efficient on machine resources, with clever and tight code that minimizes the use of memory. Such an application might squeeze the most from the machine resources available to it, but might deliver poor response time because it does not use enough memory to speed up sort and merge operations. Or suppose an application delivers very fast response time, but requires enormous machine resources to do so. Such an application likely will be too expensive to maintain.

Because the relationships between performance parameters are complex, assessments of library systems must include data on all three performance variables. Library managers should require library application programs to meet certain minimum standards. For instance, the system ought not to require a supercomputer to perform boolean searches, and it should take less than five minutes to respond to interactive queries. Managers do not, however, expect them to show ideal scores in all three areas.

Each of the three performance measures lends itself to wide discrepancies in definition and application. An "efficient" program can end up using considerably greater memory resources than an inefficient program if it seeks to minimize the use of slow mechanical devices (e.g., disk drives, tapes) by storing volumes of data in main memory for instant access. On the other hand, a system providing high capacity might do so only under ideal conditions, for instance, when all the online catalog queries are known-item searches. Because of these discrepancies, vendors and buyers of library systems should define exactly the nature of the performance parameters expected of a system. In general, efficiency and capacity in purchase contracts are largely system-dependent measures, and standards for their performance pertain only to particular hardware-software combinations. Librarians have reached a general consensus, however, on response time: interactive queries should average no more than three to five seconds from transmission of a query to receiving an answer.

Response time

Of the three parameters, response time is both the most widely applied and least understood measure. For most library managers, response time generally means the time between transmission of a query to the system (by pressing the return key) and the time when characters first appear on the screen in response to the query. Many factors in the application system affect response time, among them the speed of disk drives and how much the software uses them, the memory available to the application software, and the number of concurrent users on the system. Most measurements of response time, however, include three distinct components:

(1) Transmission time: the time required by the transmission channel to deliver queries from the terminal to the computer, and data from the computer to the terminal;

(2) Application response time: the time required by the application, after receiving a query, to process the query and to begin transmitting a response to the terminal; and

(3) Display time: the time required by the terminal to display the entire reply from the computer.

Response time measures usually do not distin-
guish the contribution of each element to the
overall response time, even though users attribute
the entire response time solely to the application
software. Under normal circumstances, compon-
ents 1 and 3 make a negligible contribution to
response time, in the range of milliseconds. In
special circumstances, however, their contribution
may be significant. For instance, in local area
networks, propagation of queries and responses
through several miles of cables, translators,
bridges, and routers can introduce significant
delays. Modem connections may also introduce
considerable delay. Even directly wired connec-
tions can slow down response time if the line
speed is low or the output device is a printer.
Because of these factors, vendors of application
systems usually agree to meet response time cri-
teria only in the context of control over the entire
hardware plant, and they specify dedicated termi-
nals using the fastest and most secure communi-
cations possible.

How is response time actually measured?
Three approaches are possible. In the stopwatch
method, one evaluator enters a query to an online
catalog while a second evaluator times the query
with a stopwatch. The second evaluator starts the
timer at the instant the return key is pressed and
stops the timer at the instant the reply begins to
appear on the screen. Generally, the evaluators
employ a carefully designed script that exercises
most of the searching functions of the system in
simple and more complex searches. The response
time is averaged over all the searches and over a
number of sessions. By using a number of termi-
nals and users simultaneously, the evaluators can
mimic a real online environment with multiple
simultaneous searchers, subjecting the application system to a stress test or benchmark test. In its simplest form, a stress test measures the responsiveness of a computer system as more and more of its functions are used simultaneously. Stress tests usually identify a peak load, or number of concurrent users, beyond which performance becomes unacceptable. (See Figure 2.)

The stopwatch method is simple to implement, cheap, flexible, and expandable. It is also, by and large, a reliable method if done carefully. Nonetheless, human reaction time, communication time, and other variables may affect the final results.

FIGURE 2.
Typical response time under load revealed by stress test
Note sharp degradation as load increases
[Graph omitted: average response time plotted against number of terminals, 0 to 80.]
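As a small sketch of how stopwatch or simulation timings gathered at increasing terminal counts might be summarized (the pattern Figure 2 depicts), the Python fragment below averages invented timings at each load level and reports the largest load that still meets a five-second ceiling; all numbers are hypothetical.

CEILING_SECONDS = 5.0      # acceptable average response time

# terminal count -> response times (seconds) observed for the scripted searches; invented data
stress_results = {
    10: [1.9, 2.2, 2.0],
    20: [2.6, 2.8, 2.5],
    40: [4.1, 4.6, 4.3],
    60: [7.8, 9.2, 8.5],
}

peak_load = 0
for terminals in sorted(stress_results):
    average = sum(stress_results[terminals]) / len(stress_results[terminals])
    print(f"{terminals} terminals: average response {average:.1f} seconds")
    if average <= CEILING_SECONDS:
        peak_load = terminals

print(f"peak load under a {CEILING_SECONDS:.0f}-second ceiling: about {peak_load} terminals")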

In the simulation method, a desktop computer or a simulation program on the library system's computer is equipped with a search script, similar to that used in the stopwatch method, and is connected to the library application program. The desktop computer or the simulation program transmits queries and receives replies from the library application program, and measures very precisely the time between transmitting a query and receiving a reply.

The simulation method requires modest technical expertise to implement. It retains all the advantages of the stopwatch method while eliminating variables introduced by human participants. Just as with the stopwatch method, evaluators can establish a bank of computers, or a number of simulation programs, to execute the prepared search scripts simultaneously in order to subject a computer system to a stress test.
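The Python sketch below illustrates the simulation idea: each scripted query is sent to the catalog over a plain TCP connection and timed from transmission to the first byte of the reply. The host, port, and query syntax are placeholders; a real test program would have to match the target system's actual protocol, prompts, and session handling.

import socket
import time

HOST, PORT = "opac.example.edu", 23      # placeholder address, not a real system
SCRIPT = ["FIND TI moby dick", "FIND AU twain mark", "FIND SU library statistics"]

def timed_query(query):
    """Send one query and return seconds until the first byte of the reply."""
    with socket.create_connection((HOST, PORT), timeout=30) as conn:
        start = time.monotonic()
        conn.sendall(query.encode() + b"\r\n")
        conn.recv(1)                     # block until the reply begins to arrive
        return time.monotonic() - start

times = [timed_query(query) for query in SCRIPT]
print(f"average response time: {sum(times) / len(times):.2f} seconds")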

The system monitor method uses software on the computer system itself to record response time data on devices and software supported by the system. Most general-purpose hardware platforms for library automation systems provide system software to record statistics on the performance of a program in a number of areas: how much memory it uses, how often it accesses disk drives, how quickly it answers requests from terminals, and how much data it sends to them. Mainframe computers have used such programs for years to generate billing data, and such programs have been tuned to a high degree of accuracy and comprehensiveness.

Both the stopwatch method and the simulation method rely on searching scripts as models of anticipated user behavior to gauge actual system performance. The design of such scripts seldom reflects user reality. (Recent attempts to base scripts on statistical and qualitative evaluation of transaction logs, which are verbatim records of every query to a library automation system and every response of the system to the user, are improving the design of such scripts.) Instead, the scripts are thorough exercises of every aspect of an application system's functionality, with a mix of commands that cover every possible function in the system. The application system's response time to such a mixture certainly reveals its response in carrying out specific functions, but may not reflect its actual responsiveness in operational use.

The system monitor approach, in contrast, is a strictly empirical one. Rather than develop a model of user behavior, it exhaustively records an application's responses to actual users and search loads. Since it is capable of analyzing the data from hundreds of thousands of commands entered over extended periods of time, it also draws upon a much larger universe of experience than any model can construct. As a result, its measurements provide a more accurate and complete picture of response time than user models do and are free of biases resulting from a poorly designed mix or scheduling of test queries.

The system monitor method has two additional advantages. First, it is not implemented as a special test requiring staff participation, special test computers, or special software. Monitor software runs as part of the normal operating environment and generates reports on terminal activity and response time as part of the daily activity log of the system. Second, it provides a much more detailed and comprehensive picture of an application's performance, including data on its use of machine resources as well as response time. Data provided by system monitor programs bear importantly on understanding an application's efficiency and throughput, for instance.

System monitors measure response time at the point where the communications system connects to the computer, so that communication delays are not included in the response times. Managers can use this data to evaluate software performance independent of the communications plant and to help attribute response time problems to either software or communications. On the other hand, the actual response time experienced by a user is the most visible indication of an online system's performance. In general, library managers supplement system monitor reports with periodic monitoring of actual user response time, including stopwatch measurements when necessary.

Because of their inherent limitations, response time measurements that require search models are best limited to acceptance testing: the final tests of functionality and performance before a library accepts a vendor's system and pays for it. Managers may use them to strike periodic benchmarks, but they should recognize that the models do not usually reflect the actual use or response time of the system. System monitors should be used for pre-purchase tests by obtaining data from operational sites; such data may point to performance problems before acceptance. The data may prove particularly useful if the desired system is installed at a site closely matching the profile of the purchasing site, with user populations similar in size, interests, and activity, and identical hardware resources. Even under these circumstances, system monitor results should not be the basis of final acceptance for payment; it is simply too easy to overlook differences between one installation and another. After installation, however, library managers should receive regular system monitor data that reports actual performance of the software: response time, computer resource use, and the like.

Numerous observers have raised two particular concerns with respect to measuring response time in library systems.3,4 First, online catalog searches vary widely in the amount of work they require of a program. Many searches are direct, known-item searches, where the program need only retrieve single records. Other searches may require locating, performing combinatorial operations with, and retrieving large sets of records. Second, the definition of a "search" is open to debate. Is a search concluded only when the user locates the information required? Or should library managers consider a search equivalent to a transaction, defined as a single interaction between user and computer?

Methods that rely on models address these concerns by using search scripts that exercise most of the functionality of the application software. The scripts include searches that require considerable processing as well as known-item searches, and usually provide for multi-step searches (e.g., perusing an index list, selecting a retrieval set, and then narrowing the set to find the desired item). The system monitor method, on the other hand, cannot distinguish difficult from simple searches; it measures the response time for individual transactions, regardless of their type. System monitor methods compensate for this limitation by processing a very large transaction volume, which ultimately produces a statistically valid judgment of normal response time.

Efficiency and Capacity

The efficiency and capacity of library applica-
tions software are affected by a great many factors
in the library system taken as a whole: the hard-
ware platform, the programming language used to
implement the system, the architecture of the
application software, data storage techniques,
and even the operation of unrelated software.
Assessing the efficiency and capacity of a program
requires quantitative data, an intimate knowledge
of the hardware platform, and extensive experi-
ence with the general capabilities of software in a
given hardware environment.

A program is said to be efficient when it
performs work with optimal use of hardware
resources. Inefficient programs are obvious to
system managers; they require prodigious resour-
ces to perform simple tasks. Efficient programs
are not so easily pinpointed. Generally only close
examination of the actual code or architecture of
an efficient program reveals areas for improve-
ment (or admiration). For example, a program-
mer can improve the efficiency of a program by
decreasing disk drive access, memory resource
use, or CPU time to perform a given task. Effi-
ciency judgments extend to suites of applications
programs as well as to single programs, since
library applications often consist of a number of
programs performing different tasks in concert.
The overall architecture of a system can be con-
sidered efficient or inefficient, depending on how
it uses system resources.

Efficiency bears directly on capacity. Capacity measures the amount of work a computer system can perform given a certain mix of machine resources and programs. Capacity relates to the computer system as a whole, not just to a given applications program, since both the available hardware resources and a program's use of them determine the amount of work possible. Efficient programs make better use of hardware resources. A computer running efficient programs can perform more work in any given machine configuration than one running inefficient ones. For instance, efficient programs may permit the system to handle up to twenty concurrent users, while inefficient programs may reduce this capacity to only ten or twelve. The types of work performed on a computer system also affect its capacity. Certain users or activities require more hardware resources than others and can affect overall capacity significantly. For example, catalogers and other technical support users editing the database usually require a great deal more CPU support, disk access, and the like than someone merely searching the catalog.

The first step in measuring capacity is to
determine the amount and kinds of activities in
the computer system at any given time as well as
the various resource consumption and perform-
ance measurements of the system while engaged
in those activities. System monitor programs pro-
vide comprehensive data on how a computer
system is actually used throughout the day. Once
system monitors are in place to measure activity,
resource consumption, and response time, the
systems manager builds a resource use profile by
analyzing data from days or months of use. The
profile indicates peak resource consumption
periods, overall resource use, and the resources
consumed by particular application programs. A
system is said to reach capacity when either of
two events occurs:

(1) Consumption of hardware resources reaches defined maximum limits. The defined maximum resource use of hardware platforms, beyond which additional resources are recommended, varies from manufacturer to manufacturer. Most manufacturers consider a CPU saturated, for instance, at about eighty-five percent average use. Disk storage reaches a maximum when growth space is not sufficient for short-term growth.

(2) Response time for online users degrades below a defined maximum. When response time degrades beyond an average of five seconds for most transactions, for instance, the computer system no longer has capacity for additional users. (A simple check of these two conditions is sketched below.)
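As a minimal illustration, the Python fragment below applies the two thresholds just described (an eighty-five percent CPU ceiling and a five-second average response time) to a hypothetical resource use profile; the profile figures are invented.

CPU_SATURATION = 0.85        # typical manufacturer ceiling cited above
MAX_RESPONSE_SECONDS = 5.0   # average response time beyond which capacity is exhausted

profile = {                  # invented example figures from a resource use profile
    "average_cpu_use": 0.78,
    "average_response_seconds": 5.6,
}

at_capacity = (
    profile["average_cpu_use"] >= CPU_SATURATION
    or profile["average_response_seconds"] > MAX_RESPONSE_SECONDS
)
print("system has reached capacity" if at_capacity else "capacity remains")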

Systems managers and librarians employ resource use profiles precisely to avoid reaching capacity on a computer system. By monitoring system resource consumption through frequent profiles, managers can model future system demand and project the resource requirements necessary to maintain adequate response time, disk storage, and other resources.

A Management Example: Performance
Measures at TRLN

The Triangle Research Libraries Network (TRLN) is a cooperative library automation project of Duke University in Durham, North Carolina State University (NCSU) in Raleigh, and the University of North Carolina at Chapel Hill (UNC-CH). TRLN has focused on the Bibliographic Information System (BIS) as the core and first module of an integrated library system. The circulation control module is currently undergoing beta testing at NCSU. A vendor-supplied acquisitions and serials control system will be implemented by all three institutions. The TRLN system is a distributed system. Tandem computers located on each campus support the catalog for that campus. TRLN's unique software allows library patrons to search any one of the catalogs in the network or to search multiple catalogs simultaneously, displaying the results as a merged retrieval set.
FIGURE 3.
Summary Terminal Use And Response Time Report By Terminal

DATE OF THIS REPORT: 03/06/90    RUN TIME: 04:10:56 AM

TERMINAL-NAME    REP-DATE    TOTTRAN    AVERAGE RESPONSE

$ATPO #VAXI 03/05/90 219.00 4.56
$ATPO #VAXK2 03/05/90 674.00 4.88
$ATPO #VAX3 03/05/90 92.00 2.67
$ATPO #VAX4 03/05/90 666.00 3.65
$ATP1 #DCAIL 03/05/90 1089.00 4.20
$ATP1 #DCA2 03/05/90 768.00 3.64
$ATP1 #VAX5 03/05/90 290.00 4.00
$ATP1 #VAX6 03/05/90 481.00 3.60
$ATP2 #DCA3 03/05/90 623.00 3.83
$ATP2 #DCA4 03/05/90 961.00 4.19
$ATP2 #DCA5 03/05/90 164.00 3.30
$ATP2 #DCA6 03/05/90 534.00 3.60
$ATP3 #DCAL10 03/05/90 423.00 3.60
$ATP3 #DCAT 03/05/90 1089.00 3.84
$ATP3 #DCA8 03/05/90 717.00 4.53
$ATP3 #DCA9 03/05/90 263.00 4.11
$ATP4 #DCAIL1 03/05/90 773.00 4.69
$ATP4 #DCA12 03/05/90 848.00 4.45
$BSCTR33 #BASS1 03/05/90 307.00 4.38
$BSCTR33 #BASS3 03/05/90 76.00 8.07
$BSCTR33 #CIRC1 03/05/90 65.00 3.24
$BSCTR33 #HUM1 03/05/90 367.00 3.51
$BSCTR33 #HUM2 03/05/90 100.00 3.72
$BSCTR33 #HUM3 03/05/90 170.00 4.43
$BSCTR34 #CHEM1 03/05/90 126.00 4.18
$BSCTR36 #PUBB1 03/05/90 786.00 3.67
$BSCTR36 #PUBB10 03/05/90 811.00 3.59
$BSCTR36 #PUBB11 03/05/90 786.00 3.71
$BSCTR36 #PUBB12 03/05/90 520.00 4.30
$BSCTR36 #PUBB2 03/05/90 302.00 4.76
$BSCTR36 #PUBB3 03/05/90 529.00 4.02
$BSCTR36 #PUBB4 03/05/90 333.00 3.83
$BSCTR36 #PUBB5 03/05/90 1286.00 3.84
$BSCTR36 #PUBB6 03/05/90 861.00 4.07

Response time average per transaction for all terminals:
3.80 seconds
Total number of transactions for all terminals: 32,644.00
NOTE: A transaction is equal to reading a command and
outputting a response to the command.







The three TRLN universities use two system
monitor tools available on their systems to gener-
ate and analyze performance data. The system
resource and performance monitor software
MEASURE, available as part of the Tandem oper-
ating system software, collects detailed data on
terminal response time, application resource use,
and other items of interest (e.g., communication
line activity, disk drive accesses). ENLIGHTEN, a
third-party product from Software Professionals,
Inc., can be used with MEASURE-created files to
construct graphic representations of the data
either online dynamically or in print format.

The TRLN libraries use MEASURE to collect
response time data, to analyze software efficiency
and pinpoint areas for improvement, and for
capacity modeling and projection. Capacity
modeling and efficiency analysis require the
collection and analysis of enormous volumes of
data, usually on a great many hardware and
software parameters simultaneously. Because of
the volume, this kind of data is collected only
periodically, and then through well-defined sam-
ples of system activity throughout the day (see
below). On the other hand, data on the number of
transactions on the system and the average
response time for those transactions by port or
terminal (Figure 3) and by time of day (Figure 4)
is monitored constantly. The transaction response
time reported by MEASURE is not the user-appar-
ent response time. MEASURE calculates only the
response time from the moment a command is
received by the Tandem system to the moment a
response is sent from the Tandem to the user
device. It does not include communication time or
display time.
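
The overall figures at the bottom of Figures 3 and 4 (total transactions and an average response time per transaction) follow from the per-terminal rows by weighting each terminal's average response time by its transaction count. A minimal sketch of that calculation, using a few sample rows in the format of Figure 3 rather than actual MEASURE output:

    # Sketch: roll per-terminal (transactions, average response) rows up into
    # the report's overall totals. Rows are samples in the style of Figure 3.
    rows = [
        ("$ATP0 #VAX1",     219, 4.56),
        ("$ATP1 #DCA2",     768, 3.64),
        ("$BSCTR36 #PUBB5", 1286, 3.84),
    ]

    total_transactions = sum(count for _, count, _ in rows)
    overall_average = sum(count * avg for _, count, avg in rows) / total_transactions

    print("Total transactions:", total_transactions)
    print("Average response per transaction: %.2f seconds" % overall_average)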

This basic transaction and response time
information is used in a variety of ways at the
three universities: to prepare reports and track
trends; to justify, plan, and budget equipment
purchases; and to analyze the workload on and
balance of the composite system. Each of the
three universities reports the average number of
daily transactions on its system and the average
response time in the monthly TRLN Project Status
Report. Despite different hardware configura-
tions, the data provides some indication of the
relative use of the three Tandem-based systems.
For instance, in the fall of 1989, each TRLN insti-
tution experienced a sharp increase in the level of
transactions, some by nearly forty percent. Other
statistics, collected within the libraries, corrobor-
ated this increased use of library services. Circula-
tion, for instance, increased nearly thirty percent
at NCSU.


FIGURE 4.

Summary Terminal Use And Response Time
Report By Time of Day

DATE OF THIS REPORT: 03/06/90 RUN TIME: 04:11:54 AM

AVERAGE TOTAL
FROM-TIME TO-TIME RESPONSE TRANSACTIONS
08:00:00 AM 08:30:00 AM 2.38 413.00
08:30:00 AM 09:00:00 AM 2.74 784.00
09:00:00 AM 09:30:00 AM 2.94 764.00
09:30:00 AM 10:00:00 AM 3.13 834.00
10:00:00 AM 10:30:00 AM 3.73 1143.00
10:30:00 AM 11:00:00 AM 4.29 1528.00
11:00:00 AM 11:30:00 AM 5.05 1665.00
11:30:00 AM 12:00:00 PM 7.15 1777.00
12:00:00 PM 12:30:00 PM 4.80 1649.00
12:30:00 PM 01:00:00 PM 3.71 1112.00
01:00:00 PM 01:30:00 PM 3.75 1127.00
01:30:00 PM 02:00:00 PM 3.83 1264.00
02:00:00 PM 02:30:00 PM 4.58 1580.00
02:30:00 PM 03:00:00 PM 4.27 1390.00
03:00:00 PM 03:30:00 PM 3.81 1192.00
03:30:00 PM 04:00:00 PM 3.98 1430.00
04:00:00 PM 04:30:00 PM 4.07 1227.00
04:30:00 PM 05:00:00 PM 4.18 1383.00
05:00:00 PM 05:30:00 PM 2.96 950.00
05:30:00 PM 06:00:00 PM 2.93 788.00
06:00:00 PM 06:30:00 PM 3.07 860.00
06:30:00 PM 07:00:00 PM 2.73 586.00
07:00:00 PM 07:30:00 PM 2.93 582.00
07:30:00 PM 08:00:00 PM 3.51 1034.00
08:00:00 PM 08:30:00 PM 3.41 1171.00
08:30:00 PM 09:00:00 PM 3.75 978.00
09:00:00 PM 09:30:00 PM 3.27 842.00
09:30:00 PM 10:00:00 PM 2.62 552.00
10:00:00 PM 10:30:00 PM 2.61 832.00
10:30:00 PM 11:00:00 PM 2.96 452.00
11:00:00 PM 11:30:00 PM 1.92 202.00

Response time average per transaction for all terminals:
3.80 seconds

Total number of transactions for all terminals: 32,644.00

NOTE: A transaction is equal to reading a command and
outputting a response to the command.

On each campus, this basic transaction and
response time information is reported to the
library administration, library staff, and library
users (e.g., Figure 5). It can be used to demon-
strate progress or to warn of potential problems.
In 1987, for instance, TRLN began to re-examine
its software programs, rewriting many of them to
increase the efficiency of the system. The resulting
thirty-five percent increase in efficiency provided
sufficient processing reserve to absorb the sharp
increase in transaction levels in the fall of 1989
and still maintain "acceptable" response time. On the
the campus of UNC-CH, the data has been used to
monitor the need for additional terminals in the
House Undergraduate Library based upon the
average number of transactions per day per port





or terminal. As a result, in the past two years, the
number of available terminals in that location has
been doubled.

The records of terminal activity in a particu-
lar area also can be used to question the need for
a terminal in areas of light or low use. For
instance, at UNC-CH, a terminal in one depart-
ment generated only ninety-nine commands in a
two-week period during February 1990. On the
basis of this data alone, it would seem that a term-
inal in this area was not justified. Such data
should mandate a review of justifications for
maintaining a terminal in little-used locations.

The daily statistics can be used to schedule
batch jobs which contend with online functions
for resources. Through a semester, TRLN staff
monitor busy times and busy days of the week. As
might be expected, activity declines sharply late
Friday afternoon. Tuesdays, however, are as busy
as or busier than Mondays. TRLN staff generally
schedule extensive processing runs during low-
use periods.
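
One way to picture how the Figure 4 data supports that scheduling decision, offered only as a sketch (interval labels and counts below are sample values in the format of Figure 4): rank the half-hour intervals by transaction volume and treat the quietest as candidate windows for batch runs.

    # Sketch: pick the quietest half-hour intervals as candidate batch windows.
    intervals = [
        ("08:00-08:30 AM", 413), ("11:30 AM-12:00 PM", 1777),
        ("05:30-06:00 PM", 788), ("11:00-11:30 PM", 202),
    ]

    quietest = sorted(intervals, key=lambda pair: pair[1])[:2]
    for label, count in quietest:
        print("Candidate batch window %s: %d online transactions" % (label, count))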

Terminal activity levels also can help identify
physical conditions that lead to heavy use of
terminals. In the cluster area of UNC-CH's Davis
Library, for instance, one terminal is more heavily
used than any other. Two characteristics distin-
guish this terminal: (1) it has more room for users
to set materials down on either side of the termi-
nal than do other terminals in the cluster, and (2)
there is ample "personal" space because it is
separated from other terminals, so that no other
terminals (and hence no other users) are close by.

In planning and budgeting for the normal
growth of systems, the pattern of current use can

be called upon to project future needs. The in-
crease in transaction levels needs to be closely
monitored to determine the need for additional
terminals and the need for additional processor
capacity. This growth in transaction levels, cou-
pled with the increase in data base coverage
through retrospective conversion and new ser-
vices such as the implementation of the TRLN
Circulation Control Subsystem, must all be fac-
tored into planning the annual budget allocations
and biennial budget proposals.

Dial access is one area where this data should
be carefully monitored. A frequent question about
remote access is: how much is enough? The sim-
plest answer is that there is no single answer; it is
always changing. The question should be how to
monitor its use and to plan for its growth. Unfor-
tunately, managers cannot know how many so-
called "invisible users" exist, and these users as a
rule do not inform managers about problems in
accessing the online catalog. Even if users fre-
quently encounter busy signals when they try to
access the catalog, library managers may never
find out that their remote access ports are con-
stantly busy. In cases where the majority of remote
access comes from links into existing campus net-
works, there may be no easy method to determine
how often users are denied access to the catalog
because its network slots are filled. (Interestingly,
while in-house use increased dramatically at UNC-
CH in fall 1989, dial access use showed no corre-
sponding increase.)

MEASURE and ENLIGHTEN are used to moni-
tor processor loads, memory use, disk activity, and
other processes. With these tools, the systems

FIGURE 5.

[Two line charts: "Number of System Transactions - Daily Average" and "Transaction Response Time - Daily Average" (in seconds), plotted by month for 1988, 1989, and 1990; data collected Monday through Friday.]






manager can generate graphical displays showing
the use of processor capacity and memory resour-
ces, and the distribution and timing of disk use
among multiple disk drives. In a multi-processor
configuration, the tools can show how the load is
distributed over the processors (e.g., where the
load is heaviest and where the load is lightest). All
of this information is necessary to "balance" or
"tune" the system load across the available pro-
cessors and disks. System tuning directly impacts
the efficiency of the system and the user-apparent
response time. As hardware is added or new
programs installed, the resource balance must be
re-examined and the system tuned to preserve
optimal use of resources.

BIS is implemented as multiple copies of a
suite of programs. One advantage to this approach
is to provide redundancy in the event of a system
problem or crash. NCSU, for instance, runs six
copies of the BIS software. If a problem occurs on
one copy, it affects only one-sixth of the terminals.
The terminals are distributed across the six sys-
tems based upon the load level, location, type of
activity, and other factors. As new copies of the
software are added to the system, the systems
manager can use daily transaction data to redis-
tribute terminals and maintain optimal trans-
action balance among the copies.
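
As a rough illustration of that redistribution (TRLN's actual procedure also weighs location and type of activity, which this ignores), terminals can be assigned, heaviest first, to whichever copy of the software currently carries the lightest daily transaction load. Terminal names and counts below are illustrative.

    # Sketch: greedy balancing of terminals across software copies using
    # average daily transaction counts per terminal.
    def assign_terminals(daily_counts, num_copies):
        loads = [0] * num_copies
        assignment = {}
        for name, count in sorted(daily_counts.items(), key=lambda kv: -kv[1]):
            copy = loads.index(min(loads))   # least-loaded copy so far
            assignment[name] = copy
            loads[copy] += count
        return assignment, loads

    assignment, loads = assign_terminals(
        {"PUBB5": 1286, "DCA2": 768, "VAX4": 666, "HUM1": 367, "CHEM1": 126},
        num_copies=3)
    print(assignment)
    print("Transactions per copy:", loads)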

In addition to load balancing, these perform-
ance measurement tools can be used with the
individual programs to gauge their relative effi-
ciency and to identify where improvements can be
made. In one project at TRLN, MEASURE was
used to calculate the CPU time in milliseconds per
transaction for each program.5 A program could
then be selected for closer scrutiny and MEASURE
was again used to identify, within the program
code, paragraphs that used a large percentage of
CPU time. At this level of detail, problems general-
ly become fairly easy to recognize and correct.
TRLN used such procedures to achieve a thirty-
five percent reduction in CPU use.
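
The per-program figure described in that project is simply total CPU time divided by the number of transactions handled; a toy example of the arithmetic (program names and measurements are invented, not TRLN data):

    # Sketch: CPU milliseconds per transaction for each program, from totals
    # a system monitor such as MEASURE might report over a sampling period.
    samples = {
        "SEARCH-SERVER":  {"cpu_seconds": 410.0, "transactions": 9200},
        "DISPLAY-SERVER": {"cpu_seconds": 150.0, "transactions": 11500},
    }

    for program, s in samples.items():
        ms_per_txn = s["cpu_seconds"] * 1000.0 / s["transactions"]
        print("%s: %.1f ms CPU per transaction" % (program, ms_per_txn))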

Conclusion

Librarians make use of a variety of tools and
techniques to assess the performance of library
systems. The different stages in the life cycle of a
system require different performance measures
that deliver data appropriate to the decisions
required for each stage. During the initial acquisi-
tion of a system, performance measures that
deliver benchmark and peak load data, such as
simulation and stopwatch response time mea-
sures, are crucial to deciding the suitability of a
product to a given library's environment, and they
figure importantly in developing the initial hard-
ware configuration for installation. Throughout
the production life of a system, system monitor
data provides regular assessments of the capacity,
response time, and utilization growth of the sys-
tem. In particular, library managers closely moni-
tor response time, because it remains the single
most important determinant of user satisfaction.
At the end of the life cycle, system monitor data
forms a significant part of the management data
required for functional design, performance speci-
fications, and hardware configuration for migra-
tion to a new library system.

Performance measurement tools provide
basic management data to support a variety of
decision points during the production life of a sys-
tem. Initial purchase, system tuning, terminal
allocation, load balancing, optimal timing for
resource-intensive processing, and system migra-
tion all depend upon comprehensive data con-
cerning the kinds of activities and their resource
demands on the system. It behooves library mana-
gers to develop an understanding of the nature
and use of performance measures, to become
familiar with different performance measures,
and to ensure that their systems provide the data
they require for system management decisions.

References
1. "Special Section: Measuring System Performance," Information Technology and Libraries 7 (June 1988): 173-97.
2. Jerry V. Caswell, "Performance evaluation of computerized library systems," in Advances in Library Automation and Networking, vol. 2, ed. Joe A. Hewitt (Greenwich, Conn.: JAI Press, 1988).
3. Clifford A. Lynch, "Response time measurement and performance analysis in public access information retrieval systems," Information Technology and Libraries 7 (June 1988): 177-83.
4. Robert N. Bland, "Evaluating the performance of the online public access catalog: a redefinition of basic measures," North Carolina Libraries 47 (Fall 1989): 168-73.
5. Gwyneth M. Duncan, "Using MEASURE to identify performance bugs in COBOL programs," Tandem Users' Journal 9 (Nov./Dec. 1988): 13.



"Anyone can learn just
about anything they want
to know by using the
library. It's the means for
completing our quest."

- Gov. Jim Martin speaking August 1990 to about 150
people at a regional Governor's Conference on Library
and Information Services at the Public Library of
Charlotte and Mecklenburg County in uptown Charlotte.










Theory Into Practice

Patricia M. Kelley

Performance Measures: The Theory

Rightfully so, an academic librarian who is
considering the implementation of performance
measures will question whether the usefulness
justifies the time and effort required. After all,
don't we already know whether or not our librar-
ies are doing the best job possible with the resour-
ces at our disposal? The short experience with
performance measures of Gelman Library at The
George Washington University indicates that the
time and effort are well spent and that measures
help to provide objective evidence to support or
refute our intuitive professional evaluations of
how well we are serving our community.

The data we have gathered offer few sur-
prises. Like librarians everywhere, we have a fairly
good sense of where our successes and difficulties
lie. Our dilemma is that each person is familiar
with a few pieces of a puzzle that portrays a
complex service organization. The shapes of our
puzzle pieces change continually, however, with
the introduction of new technologies, the rise and
fall of budget allocations, turnover of staff, pro-
grammatic changes in our parent institution, and
resource sharing opportunities. As we work with
each other and with the faculty and administra-
tors outside the library to ensure that the puzzle
pieces continue to fit together properly and that
the picture they form is pleasing to this particular
university, we find that we need to describe library
operations in concrete terms. We need to describe
objectively the state of the library to ensure clarity
of communication and to give credibility to the
assessment we make about how well the library is
serving students and faculty. Performance mea-
sures provide that description. They can be used
to explain what the library is achieving and what
resources it needs. When compared with stan-
dards, they describe how well the library is per-
forming. And when compared with organizational
goals, they tell us how well we are serving our
target clientele.

Patricia M. Kelley is Assistant University Librarian for Pro-
grams and Services at The George Washington University in
Washington, D.C.

In reality, how can an academic library insti-
tute performance measures? This article describes
why and how the Gelman Library initiated a pro-
gram of performance measures, how we measured
the accessibility of collections and services, and
my assessment of the experience.

Performance Measures: The Practice

Why did the library institute performance
measures?

Use and user studies have been conducted in
Gelman Library for a variety of purposes for
years, but the decision to create an ongoing pro-
gram of performance measures emerged as a
result of our formal planning process in 1986.
Believing that the library needs to be a dynamic,
change-oriented service organization, the univer-
sity librarian introduced a strategic planning pro-
cess. One critical element of this process is the
environmental scan, which requires that we
understand both our external and internal work-
ing environment. In part, a management informa-
tion system helps to describe our internal library
environment. The administrators in this library
conceive of performance measures as part of that
management information system. As we change
policies and reallocate resources in order to
accomplish our strategic goals, data from per-
formance measures will reflect the results - both
intentional and unintentional - of many of our
planned changes.

Realizing that we could not allocate the neces-
sary staff to conduct performance measures for
all activities at once, we categorized activities and
assigned priorities. Then we scheduled the imple-
mentation of measures in each category to be
accomplished over a five year period. Categories
of activities were designated as follows: accessi-
bility of services and collections; collection quality;
human resources; facilities; user education; library
as gateway; and planning process. Although we
roughly grouped library activities in these cate-
gories at the time we established the timeline,
refinements are made as we address each one.






For example, we defined accessibility of services
and collections to include in-house collection
availability, turnaround time on interlibrary loan
transactions and on searches for unfound items,
utilization of equipment, and length of lines at
service points.

Except for the accessibility category, our
timeline relates to major events in the predictable
future. For example, collection quality assessment
began during the year when the Library played its
first significant role in academic program review.
Accessibility studies were selected as our first
category because we had specific questions we
wanted to answer, and because we wanted to
learn more about the pattern of use by our
primary user group as compared to that of visitors
who make up a significant proportion of our user
community. To understand our concern and why
we believe that performance measures provide
much better management information than does
our professional judgment by itself, some infor-
mation about this library will be helpful.

Gelman Library is the main university library
on the main campus of The George Washington
University. Our primary user groups, and there-
fore our target audience for collections and ser-
vices, are the faculty, students, and staff of this
University and, to a slightly lesser degree, the
students and faculty of other member universities
of the Washington Research Library Consortium.
However, the campus is located in downtown
Washington, D.C., adjacent to the Federal office
area and easily accessible to more than seven
hundred consulting firms and law firms. Unlike
many urban university libraries, Gelman is avail-
able for on-site use by any member of the public
who presents current photo identification at our
registration desk. As a result, researchers from
government agencies and private firms form a
significant non-target clientele. Because the
majority of our students are graduate students
and most of them are employed in local govern-
ment agencies or private firms, the "visitors" are
not readily distinguishable from the students and
faculty. This inability to differentiate at a glance
complicates our ability to make informal assess-
ments of how well we serve our primary clientele.
Just to make things really challenging, we share a
building with a number of administrative and
academic offices and classrooms. Unfortunately,
all of these non-library activities are accessible
only through the Library's main entrance.

Selection and implementation of performance
measures

The first and most essential step in establish-
ing a performance measures program is educating


the staff. Unless the staff understands and buys
into the process, measurement of library activity
is likely to be viewed negatively. We are so accus-
tomed to thinking in terms of goals and standards

The first and most essential
step in establishing a perform-
ance measures program is
educating the staff.

and so accustomed to one-shot surveys, that it is
difficult to accept the concept that measurement
done consistently over time and done indepen-
dently of standards will be valuable. The educa-
tional effort in Gelman Library had several com-
ponents. One was an addendum to the strategic
planning document which described the measure-
ment and assessment model we would implement.
That model defines measurement - as distinct
from assessment - and lists the components of
the process that pertain to each. Every staff
member received a copy of the plan, including the
addendum, during a staff gathering in the fall
when the university librarian explained the
reasoning behind the various provisions of the
plan. In addition, articles about measurement
appeared in our weekly staff newsletter. The most
concentrated educational activity was the
management retreat, which was attended by all
administrators and heads of library departments
and units. This one-day retreat focused on per-
formance measures, with a short session on sta-
tistical reports that we file with local, regional,
and national bodies. Because these reports tend
to include primarily input data (budget, number
of staff, and other resources) and very little
output data (performance data such as reference
statistics, loan transactions, etc.), dealing with
the two topics in one retreat helped to clarify how
performance measures differ from the data
libraries traditionally collect. Aided by a specialist
in educational measurement, we used the retreat
as a workshop to learn the concepts and some
techniques of measurement. As a result of the
retreat, key staff members were able to imagine
the usefulness of measurement in their own
decision-making.

During the 1987/88 academic year, I identi-
fied the kinds of studies that would tell us how
successfully users actually locate books in our
library, the length of lines at service desks, and
whether or not we have sufficient equipment to
provide access to the collections. Because we
defined accessibility in its broadest terms, the





equipment usage we studied included our catalog
(which was on compact disc), indexing and ab-
stracting services on compact disc, microform
readers and printers, photocopy machines, and
elevators. In the beginning, I drew on the pub-
lished literature, experience, and a somewhat
similar study conducted by Tracy Casorso in
Gelman Library two years previously. Then I
worked with a number of individuals and groups
to design, plan, and implement the studies.

I sought two sources of expertise. One was
statistical; the other was operational. A professor
of management science and psychology provided
the statistical and research design assistance. He
offered invaluable advice about sampling, validity
and other technical concerns. Most of all, however,
he gave down-to-earth practical advice. He re-
assured me that studies done for purposes of
management decision-making are quite different
from experimental or laboratory research, where
conditions can be controlled. Because our re-
search is done in the real world with real library
users (who may or may not be cooperative),
where all kinds of events beyond our control
influence human behavior, we need to note the
events that may affect the results of our study.
But those events do not invalidate the study. For
example, if an exam in a large music class is
scheduled for the day after our randomly chosen
survey day, the use of audio equipment in the
Media Resources Unit will be abnormally high.
That will not be a "typical" day in that unit, but it
isn't atypical either, so we note the cause of the
high volume of use and include the data in the
study.

The other source of expertise was the Gelman
Library staff, the people who intuitively judge
demand for services and adjust staffing levels
accordingly. Not only did they provide a list of
questions they hoped our performance measures
would address, but they also gave thoughtful con-
sideration to the selection of sampling time
periods, design of data collection forms, and
logistics. Because staff in this library work
together in groups continually, it was easy to fit
planning of performance measures into regular
meetings of librarians, mid-level managers and
supervisors, heads of service units, and so forth.

We planned data collection with the convic-
tion that there is no such thing as a "typical week"
in our library. There are, however, typical patterns
within a week. For example, the usage patterns
seem to be very similar on Monday through Thurs-
day evenings. We identified nine such periods.
Then we randomly selected nineteen dates during
the fall 1988 semester for data collection, ensuring

that we had sufficient representation of every
survey period so that our survey samples would
yield meaningful data. During the following spring
semester we started a little earlier and were able
to survey on twenty-two days.
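
As a rough sketch of that sampling approach (the period names, candidate dates, and counts here are invented, not Gelman Library's actual ones), each candidate date is grouped under its usage period and dates are then drawn at random within every period, so that each period is represented.

    # Sketch: stratified random selection of survey dates by usage period.
    import random

    def choose_survey_dates(dates_by_period, per_period, seed=1988):
        rng = random.Random(seed)
        chosen = {}
        for period, dates in dates_by_period.items():
            chosen[period] = sorted(rng.sample(dates, min(per_period, len(dates))))
        return chosen

    example = {
        "weekday daytime": ["1988-09-12", "1988-09-13", "1988-09-19", "1988-10-03"],
        "weekday evening": ["1988-09-12", "1988-09-20", "1988-10-04"],
        "weekend":         ["1988-09-17", "1988-09-24", "1988-10-08"],
    }
    print(choose_survey_dates(example, per_period=2))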

In preparation for the surveys, we hired staff
who would conduct the observations. We also
developed and tested data collection forms for
each study. One form, to be given to people using
the serials lists, asked the users to note which
journals they were seeking and whether or not
they found the journal. Another asked users of
the library's catalog to give the same information
about the books they sought. Another set of forms
was used by observers who walked through the
library noting which machines were in use, which
were out of order, how many staff members were
working at specific desks, how many people were
being assisted by those staff members, and how
many people were waiting. Turnaround time on
interlibrary loan requests, book search requests,
and waiting time for appointment services could
be derived from information noted on the normal
request forms. Separate forms were designed for
data collection at service desks, although these
tended to be expansions on the data forms the
staff routinely use.

Because we wanted to distinguish current
GW faculty, staff, and students from alumni (a
significant user group), from consortium faculty
and students, and from all other researchers, we
purchased labels in four colors to issue to library
users as they entered the building. The color of
the label indicated the individual's user category
" GW user, consortium member, alumnus, or
unaffiliated researcher. As individuals requested
assistance at service desks or were observed using
collections, library staff who collected data were able
to record transactions by category of user without
having to ask each person about his/her affilia-
tion.

On survey days the entrance staff, with assist-
ance from additional staff during peak periods,
handed each entrant a colored label and asked
him/her to wear the label in order to help us
conduct library surveys. Meeting some resistance
by users who did not want to wear the label, on
the second day we began offering a letter explain-
ing the purpose of our surveys and the importance
of wearing the labels. Over time we found that a
large sign explaining the meaning of the various
colors of labels answered most users' questions.
As the survey progressed, people who were going
to non-library portions of the building or just to
study rooms declined the labels. But others wore
the dots or presented them upon request as they






sought assistance at service desks or when the
observers made their rounds to record use of
equipment and length of lines.

Most of our studies did not require conscious
participation by library users. Library staff col-
lected data through observation or as a routine
activity during normal transactions at service
desks. A user was conscious of being studied only
if he/she failed to wear the colored label and,
therefore, was asked to show the label to the data
collector.

The only data collection that required con-
scious user participation was the collection use
study, in which we asked people to note the books
and journals they sought and whether or not they
found the items. Users' willingness to fill out (or
submit) the worksheets varied from modest to
poor. As a result, while we received sufficient
response to draw general conclusions about the
causes of user failure to find the materials they
sought, the decline in response rate over the
course of the semester prevented us from answer-
ing some of our more specific questions. For
example, we had wanted to know whether the
causes of user failure varied by time of semester.
The number of survey responses dropped as the
semester progressed, leaving us with insufficient
data to analyze variation by time of semester.

Usefulness of the measures
In this initial set of studies, we collected a
great deal of baseline data that was useful in
documenting demand for specific services by cate-
gory of clientele. Many of our assumptions about
usage patterns were confirmed, and some of our
assumptions about our shortcomings were dis-
proved. For example, we had believed that we had
long lines waiting at photocopy machines and
that unaffiliated users were tying up our ABI
Inform stations. Neither of these turned out to be
true. As a result, we decided not to purchase addi-
tional copiers and postponed implementation of
measures to restrict use of the selected reference
tools on compact disc. The impact of our mal-
functioning circulation computer system and the
crowded conditions of our stacks could be
described objectively and quantitatively as a result
of the collections use study. We could state with
confidence that we have sufficient access tools of
various types to meet usersT needs, except at peak
demand periods, and could identify the times and
places where we most feel the impact of unaffil-
iated users. As a result, we have changed some
service hours, changed some policies and prac-
tices (such as providing priority service to GW
members who present identification at the Refer-
ence Desk), and identified improvements we


would like to make if the opportunities arise. But
most importantly, staff members who participated
in data collection have a new awareness of the
usefulness of performance data for decision
making. Finding that the data disproved some of
our assumptions provided a good demonstration
of the need to base decisions on hard data mixed
with experience and intuition.

In the 1990/91 academic year we will repeat
some of the accessibility studies to determine
whether the deselection process (which loosened
up space in some stacks areas), a new circulation
system, staffing reallocations, and some policy
changes have had the desired effects. When we do
that, the full usefulness of performance measures
to record changes over time will be demonstrated.
Meanwhile, we have proceeded with planning and
implementing performance measures for other
library activities.








The Evaluation of Service Activities
in Academic Libraries and Criteria
for Evaluation Selected by
Administrators of Those Libraries

Sally Ann Strickler

Administrators of academic libraries encoun-
ter financial challenges today as during no other
period in recent years. Institutional leaders
demand accountability for costly materials,
personnel, and services expenditures. Library
administrators have the significant responsibility
of carrying out academic library functions with
inflated costs and decreased funding. Libraries
are being challenged to prove their worth. Effec-
tive allocation and use of resources becomes a
necessity.

The Association of Research Libraries (ARL)
Office of Management Services (OMS) suggests
that libraries must assess library services on either
an ongoing or periodic basis. The Standards for
College Libraries and Standards for University
Libraries, prepared by the Association of College
and Research Libraries (ACRL), both require eval-
uation of the library program. Each of the six
regional accrediting commissions states that ser-
vices of the library should be regularly evaluated
to determine the library's effectiveness. Mindful of
the needs of administrators of academic libraries,
ACRL has prepared a manual of output measures
for academic libraries which will assist librarians
in measuring the impact, efficiency, and effective-
ness of academic library activities.

The difficulty in assessing library service
programs lies in the fact that available assess-
ments do not measure the quality of service and
must be cautiously interpreted. The literature
reveals a great concern regarding the topic and is
replete with research on "evaluation of library
services," "measurement of library services," "qual-
ity values of library service," and "indices of effec-
tiveness of library public services." None of the
research, however, has fulfilled the assistance
promised, that is, to produce suitable, serviceable

Sally Ann Strickler is head of the Department of Library
Public Services for Western Kentucky University Libraries in
Bowling Green, KY.

guidelines for the qualitative assessment of the
effectiveness of academic library services to be
used for measurement of service, effective plan-
ning, and assessment of user needs.

Whether librarians want to evaluate their
institutions or not, service agencies are currently
on trial in a culture that is developing a deep
skepticism, subjecting academic organizations to
scrutiny as never before. Librarians will need to

... Service agencies are
currently on trial in a culture
that is developing a deep
skepticism, subjecting
academic organizations to
scrutiny as never before.

come forward with evaluative data to support
their case, or fiscal authorities will make that
evaluation for them. Library directors must look for criteria
other than quantitative or financial to determine
the success of their institutions. What are these
criteria? How do contemporary library directors,
faced with a complex, dynamic organization,
ensure that these criteria are met?

The following questions reflecting my interest
in this dilemma formed the major purposes of my
recent research project. I sought to determine:

1. Which library services are now being
evaluated?

2. How extensive is the current involvement
of academic libraries in evaluation?

3. What are the attitudes of academic library
administrators toward the evaluation of library
services?

4. What criteria do academic library admin-
istrators consider important for evaluating the
effectiveness of library services?






5. What are the relationships among atti-
tudes toward evaluation, the perceived impor-
tance of evaluative criteria, and actual participa-
tion in the evaluative process?

6. What are the relationships between the
organizational and administrative characteristics
of the academic libraries and the levels of partici-
pation in evaluation? How do these characteristics
relate to the attitudes of academic library admin-
istrators toward evaluation?

The research survey involved one instrument
designed by the researcher. The items composing
the questionnaire were based upon the literature
review for this study to obtain information
relating to the following major areas of research
concern:

1. Management information - Included
were questions designed to determine the extent
to which libraries evaluate services, what services
are being evaluated, and what types of evaluations
are being used.

2. Perception of evaluation information -
Included were statements describing evaluation
of academic library services placed on a Likert-
type scale to allow the respondent to indicate
agreement or disagreement with the statements.

3. Evaluation guidelines information -
Included were factors considered by library ad-
ministrators to be important as meaningful cri-
teria for evaluating the effectiveness of academic
library services. A Likert-type format enabled the
respondent to indicate the degree of importance
of each factor.

4. General information - Included were
questions relating to the distinguishing character-
istics of academic libraries which do or do not
evaluate library services (e.g., size of collection,
size of library staff, size of student population,
public, independent, or church-related institu-
tion). This information was used to define sub-
groups for comparison and analysis.

The population from which the sample for
the study was drawn consisted of the chief admin-
istrative officers of 734 academic libraries whose
institutions are accredited by the Southern Asso-
ciation of Colleges and Schools (SACS) and are
listed in the member directory of the association.
A random sample of 417 was selected from this
group using a computer-generated table of
random numbers.

A pilot study was used to test the preliminary
draft of the instrument. Revised questionnaires
were sent to each of the chief administrative
officers in the random sample of SACS institution
libraries in September 1985. From the sample
population of 417, 348 responses were received


for a return rate of 83.45 percent. Of the 348
responses, 325 were usable for analysis, a valid
response rate of 77.94 percent.
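
The reported rates follow directly from those counts; a two-line check of the arithmetic:

    # Check of the reported return and usable-response rates.
    sample_size, returned, usable = 417, 348, 325
    print("Return rate: %.2f%%" % (100.0 * returned / sample_size))          # 83.45%
    print("Usable response rate: %.2f%%" % (100.0 * usable / sample_size))   # 77.94%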

Several aspects stand out as important in the
results of this study. First, as indicated in Figure

FIGURE 1.

Library Services Evaluated Most and Least Regularly,
by Library Services Area

O Designates most regularly evaluated.
X Designates least regularly evaluated.

Catalog
O Observe catalog use unobtrusively.
X Monitor computerized catalog use statistics.

Reference Service

O Observe reference staff performance unobtrusively.

X Study reference staff performance using a test set of
questions.

Collection

O Compare collection against recognized bibli-
ographies.

X Examination of collection by subject specialists who
assess the adequacy of the collection.

Materials Use

O Maintain statistics on circulation of materials outside
the library.

X Test document delivery success rate by use of
Document Delivery Test (DDT).

Bibliographic Instruction

O Survey patrons on bibliographic instruction (how
well it is presented, how important it is to patrons,
what can be done to improve it, etc.).

X Measure effectiveness of bibliographic instruction by
a pre- and post-test study.

Physical Facilities

O Study facilities use (physical arrangement of mate-
rials, service points, furniture, equipment, etc.).

X Survey patrons on their evaluation of surroundings
(environmental climate, attractiveness, etc.)

Patron Use

O Compare hours of service with those of similar
libraries.

X Measure average time patrons spend in the library.

User Needs/Satisfaction

O Analyze feedback from library committee or
academic department liaison.

X Request diary-keeping of a sample of library users,
describing library services needs/use.

Online Bibliographic Searching and Information

Retrieval

O Maintain use statistics of online searching.

X Study search performance by comparing a search
against "standard" searches conducted solely for the
purpose of evaluation.





1, traditional quantitative activities dominate the
limited evaluation programs being performed in
the responding academic libraries, with few
reporting less traditional evaluation activities
suggested in the literature. Administrators
apparently participate in less complex, easily
collected statistical measures with little user
involvement. There was strong agreement among
institutions about the evaluation activities in
which they do and do not participate.

Second, an overwhelming level of agreement
exists for support of evaluation as an essential
activity, even if the administrators do not partici-
pate extensively in evaluation (see Figure 2).
Academic library administrators indicate that
evaluation techniques are available and accept-
able to librarians, that the profession is mature
enough, and that there is sufficient commitment
to formulate methods for evaluation. Their
enthusiasm is restrained, however, by the lack of
reward by their institutional administrations.

Figure 3 shows that strong agreement also
exists on the importance of evaluative criteria
with unanimity among all library levels on the
most and least important criteria for evaluating
academic library services. One interesting aspect
of the study is the fact that the most important
evaluative criteria are reflected in the least often
reported evaluation activities and the presence of

... the most important eval-
uative criteria are reflected
in the least often reported
evaluation activities ...

the least important criteria in activities in which
academic libraries most often participate.
Finally, there was high positive correlation, a
meaningful relationship, indicated among atti-
tudes of the responding administrators toward
evaluation, their perceived importance of evalua-
tive criteria, and actual participation in evalua-
tion. It appears that those academic libraries
directed by administrators who indicate a positive
attitude toward evaluation and evaluative criteria
also participate in more evaluation activities. In
addition, most participation occurs in academic
libraries of medium size and budget, whose insti-
tutions are public and confer only bachelorTs and
Master's degrees. Interestingly, those libraries with
more automated functions participate in more
evaluation activities, suggesting that library auto-
mation technology could be used to produce
evaluative information, as well as to provide an

FIGURE 2.

Academic Library AdministratorsT Attitudes Toward
the Evaluation of Library Services

Agreed Most Often (in rank order):

1. The evaluation of library services is an essential
activity.

2. The library profession is mature enough to
formulate valid evaluation methods.

3. Imperfect measures can be useful if their limita-
tions are appreciated.

4. To obtain useful administrative information,
libraries should not hire highly trained outside evalua-
tors to evaluate library services.

5. Evaluation techniques are available.

6. The use of non-threatening measures, such as
standard bibliographies and quantitative numbers in
statistical reports, is acceptable to the library staff.

7. Evaluation of library services is not over-empha-
sized today and counter-productive to the true mission
of library services.

8. Library services are not a complex bundle of
intangibles not amenable to evaluation.

9. The library staff does not resist library service
evaluation.

10. Formulas for evaluation are not too complicated
for the mathematically uninitiated.

Agreed Least Often (in rank order):

1. Evaluation of library services is extremely
threatening to the library profession.

2. Each library is not unique and should not be
assessed in the context of its own particular history,
constraints, uses, and environment.

3. The subjective judgment of library professionals
should not be respected.

4. General professional consensus of the library
profession is not necessary to achieve a commitment to
evaluate library services.

5. Academic library administrators have been in
the dark ages far too long by failing to recognize the
critical importance of evaluation.

6. There are rewards from my institution for such
a management approach.

7. Libraries are no more varied than other organ-
izations where tools of management science have been
applied profitably.

8. Evaluation is a high level of concern in my
institution.

9. The difficulties in formulating universally
applicable measures for evaluation are not seemingly
insurmountable.

10. Evaluation should be the library manager's
watchword.

efficient delivery system for organizing and report-
ing this information, assuring better service to
library patrons.

Speculatively, as far as evaluation of academic
library services is concerned, bigger is not neces-
sarily better. Larger institutions may find difficulty
in initiating programs of qualitative evaluation
while small schools may be more able to maintain






FIGURE 3.

Perceptions of Academic Library Administrators of
the Importance of Evaluative Criteria

Most Important (in rank order):

1. The adequacy of the collection in supporting
curricular needs.

2. Interpersonal communication skills of the
members of the library staff.

3. The ability of the reference staff to answer
questions completely and accurately.

4. The maintenance of the collection and indexes
in an orderly arrangement.

5. The ability of the catalog and shelf arrangement
to disclose the holdings of particular items or materials
on particular subjects.

6. Job satisfaction of the members of the library
staff.

7. The maintenance of adequate hours of access
and professional staff assistance.

8. The provision of comfortable, attractive, quiet,
well-equipped facilities.

9. The ability of the bibliographic instruction pro-
gram to improve effective patron use of the library.

10. The provision of loan policies of optimal oppor-
tunity for students and faculty.

Least Important (in rank order):

1. The comparison of the collection against hold-
ings of other institutions.

2. The maintenance of reference assistance statis-
tics by counting and classifying inquiries.

3. The maintenance of statistics for circulation of
materials within the library.

4. The maintenance of statistics on the number of
patrons who use the library.

5. The speed with which a literature search can be
conducted.

6. The comparison of collection size with accepted
standards.

7. The comparison of seating and stacks facilities
with accepted standards.

8. The speed with which a reference inquiry can be
answered.

9. The maintenance of statistics for circulation of
materials outside the library.

10. The adequacy of the collection in supporting
faculty research needs.

... those academic libraries
directed by administrators
who indicate a positive
attitude toward evaluation
and evaluative criteria also
participate in more evaluation
activities.


patron-oriented public services and evaluation
activities.

A review of evaluation literature indicates
that complex and dynamic criteria have been
introduced for the qualitative evaluation of library
services in a seemingly endless list. The identifica-
tion of acceptable measures, however, has proven
extremely difficult. The criteria presented in the
literature may be too complex to be useful, an
obstacle to their value to managers. It appears that
the criteria selected as a result of this research
synthesize prior theory and information, combin-
ing these with the expressed preferences of the
responding administrators. The resulting struc-
ture could be of value as the library profession
moves toward the adoption of an evaluation
program acceptable to academic library admin-
istrators.

The following evaluative criteria, selected by
the responding academic library administrators
in this study as the twelve most important criteria
for evaluating the effectiveness of academic library
services, are suggested as guidelines for formu-
lating appropriate evaluative criteria. Listed with
the guidelines/criteria are examples of suitable
evaluation activities for gathering the pertinent
information needed for evaluation.






FIGURE 4.

Suggested Criteria for Evaluating the Effectiveness of
Academic Library Services with Evaluation Activities

1. The adequacy of the collection in supporting
curricular needs: (a) study distribution of funds for
collection by formula for individual subject fields, (b)
examination of collection by subject specialists who
assess the adequacy of the collections, (c) analyze feed-
back from library committee or academic department
liaison.

2. Interpersonal communication skills of the
members of the library staff: (a) survey patrons on their
evaluation of the personal assistance available for finding
information.

3. The ability of the reference staff to answer
questions completely and accurately: (a) maintain statis-
tics on proportion of questions answered correctly, and
(b) study performance of reference staff using a test set
of questions.

4. The maintenance of the collection and indexes
in an orderly arrangement: (a) survey patrons on their
use of the catalog as an information finding tool, and (b)
study materials accessibility (difficulty or delay in obtain-
ing materials).

5. The ability of the catalog and shelf arrangement
to disclose the holdings of particular items or materials
on particular subjects: (a) same as 4a and (b) same as
4b.

6. Job satisfaction of the members of the library
staff: (a) survey staff members on the extent of their
satisfaction with their positions as related to promotion,
personal growth, salary, duties, etc.

7. The maintenance of adequate hours of access
and professional staff assistance: (a) compare hours of
service with those of similar libraries, and (b) analyze
reference use patterns.

8. The provision of comfortable, attractive, quiet,
well-equipped facilities: (a) study facilities use (physical
arrangement of materials, service points, furniture,
equipment, etc.); (b) analyze use of space for stacks and
seating by comparison with accepted standards; and (c)
survey patrons on their evaluation of surroundings
(environmental climate, attractiveness, etc.).

9. The ability of the bibliographic instruction pro-
gram to improve effective patron use of the library: (a)
measure effectiveness of bibliographic instruction by a
pre- and post-test study; and (b) survey patrons on
bibliographic instruction (how well it is presented, how
important it is to patrons, what can be done to improve
it, etc.).

10. The provision of loan policies of optimal oppor-
tunity for students and faculty: (a) analysis of circulation
records, and (b) analysis of borrowing policy/privileges.

11. The ability of the online bibliographic searching
staff to retrieve relevant citations/items: (a) request
user to indicate which retrieved citations/items are
relevant, and (b) survey patrons on their use of the
online search service to find information.

12. The ability of the interlibrary loan service to
meet user needs satisfactorily in a reasonable length of
time: (a) analyze proportion of interlibrary loan requests
satisfied, and (b) assess time required to satisfy inter-
library loan requests.

Previous studies underscore the ability to
measure library effectiveness and the benefits of
qualitative measurement methods. Research
efforts have provided tools and methods for actual
decision making on measurement and evaluation
of effectiveness. No national standards have been
set, however, and there seems to be no move
toward general professional consensus on mea-
surement and evaluation of effectiveness. Library
administrators must explore all the possibilities
for a satisfactory tool to support, with more than
partial facts and figures, the previously intangible
worth, benefits, and effectiveness of libraries. It
will also be necessary for the library profession to
renew and affirm a commitment to and enthusi-
asm for the goal of truly effective library service,
strengthening its resolve to meet that challenge.

The true success of libraries must be mea-
sured by the services delivered to patrons. The
ultimate purpose of our libraries is to provide
information services. Evaluation can be a means
to that end.

References

K. E. Beasley, "Commentary." Library Trends 22 (1974): 387-93.

M. K. Buckland, "Concepts of Library Goodness." Canadian Library Journal 39 (1982): 63-66.

R. R. DuMont, "A Conceptual Basis for Library Effectiveness." College and Research Libraries 41 (1980): 103-11.

R. R. DuMont and P. F. DuMont, "Measuring Library Effectiveness: A Review and an Assessment." In Advances in Librarianship, edited by M. H. Harris. New York: Academic Press, 1979.

N. C. Feldman, "Commentary." Library Trends 22 (1974): 395-401.

E. S. Gleaves, "Three Agendas for Research in Library and Information Science." Kentucky Libraries 49 (1985): 5-19.

"Guide to Methods of Library Evaluation." College and Research Libraries News 29 (1968): n.p.

P. B. Kantor, Objective Performance Measures for Academic and Research Libraries. Washington: Association of Research Libraries, 1984.

B. Katz and R. A. Fraley, Evaluation of Reference Services. New York: Haworth Press, 1984.

F. W. Lancaster, The Measurement and Evaluation of Library Services. Washington: Information Resources Press, 1977.

C. Martell, "Editorial: Performance at the Reference Desk." College and Research Libraries News 46 (1985): 3-4.

L. A. Martin, "Commentary." Library Trends 22 (1974): 403-13.

R. H. Orr, "Measuring the Goodness of Library Services: A General Framework for Considering Quantitative Measures." Journal of Documentation 29 (1973): 315-31.

V. E. Palmour, "Performance Measures for Research Libraries." Minutes of the Ninety-second Meeting. Washington: Association of Research Libraries, 1978.

S. R. Reed, "Introduction." Library Trends 22 (1974): 253-55.

P. V. Rzasa, The Development of Measures of Effectiveness for a University Library. Unpublished master's thesis, Purdue University, 1969.

P. V. Rzasa and N. Baker, "Measures of Effectiveness for a University Library." Journal of the American Society for Information Science 23 (1972): 248-53.

Southern Association of Colleges and Schools. Proceedings 35 (1983): 46-61.

Southern Association of Colleges and Schools. Standards of the College Delegate Assembly. Atlanta, 1977.

Southern Association of Colleges and Schools. Commission on Colleges. Criteria for Accreditation. Atlanta, 1984.

"Standards for College Libraries." College and Research Libraries News 36 (1975): 290-301.

"Standards for College Libraries." College and Research Libraries News 47 (1986): 189-200.

"Standards for College Libraries." College and Research Libraries News 40 (1979): 101-10.

R. W. Swanson, "Design and Evaluation of Information Systems." In Annual Review of Information Science and Technology, edited by C. A. Cuadra. Washington: American Society for Information Science, 1975.

J. C. Virgo and D. A. Yuro, Libraries and Accreditation in Institutions of Higher Education. New York: Association of College and Research Libraries, 1981.

Upcoming Issues

Winter 1990 - Supporting the Support Staff
              Harry Tuchmayer, Guest Editor
Spring 1991 - Law and the Library
              Tim Coggins, Guest Editor
Summer 1991 - Young Adult Services
              Rebecca Taylor and Gayle Keresey, Guest Editors
Fall 1991   - Library Buildings
              Phil Barton and John Welch, Guest Editors
Winter 1991 - Conference Issue
Spring 1992 - Anniversary Issue: History of Libraries in N.C.
              Robert Anthony, Guest Editor
Summer 1992 - Librarians and the Political Process
              Nancy Bates, Guest Editor
Fall 1992   - Telecommunications
              Bil Stahl, Guest Editor
Winter 1992 - Preservation of Popular Culture
              Alice Cotten, Guest Editor
Spring 1993 - Ethics in Librarianship
              Marti Smith, Guest Editor
Summer 1993 - Children's Services
              Satia Orange and Cal Shepard, Guest Editors
Fall 1993   - Social Issues in Librarianship
              Jane Moore, Guest Editor
Winter 1993 - Conference Issue

Unsolicited articles dealing with the above
themes or any issue of interest to North Carolina
librarians are welcomed. Please follow manuscript
guidelines delineated elsewhere in this issue.


Instructions for the Preparation

of Manuscripts

for North Carolina Libraries

1. North Carolina Libraries seeks to publish articles, book
reviews, and news of professional interest to librarians in
North Carolina. Articles need not be of a scholarly nature, but
they should address professional concerns of the library
community in the state.

2. Manuscripts should be directed to Frances B. Bradburn, Edi-
tor, North Carolina Libraries, Joyner Library, East Carolina
University, Greenville, N.C. 27858.


3. Manuscripts should be submitted in triplicate on plain white paper measuring 8½" x 11".

4. Manuscripts must be double-spaced (text, references, and
footnotes). Manuscripts should be typed on sixty-space lines,
twenty-five lines to a page. The beginnings of paragraphs
should be indented eight spaces. Lengthy quotes should be
avoided. When used, they should be indented on both mar-
gins.

5. The name, position, and professional address of the author
should appear in the bottom left-hand corner of a separate
title page.

6. Each page after the first should be numbered consecutively
at the top right-hand corner and carry the author's last name
at the upper left-hand corner.

7. Footnotes should appear at the end of the manuscript. The
editors will refer to The Chicago Manual of Style, 13th edition.
The basic forms for books and journals are as follows:

Keyes Metcalf, Planning Academic and Research Library Buildings (New York: McGraw, 1965), 416.

Susan K. Martin, "The Care and Feeding of the MARC Format," American Libraries 10 (September 1979): 498.

8. Photographs will be accepted for consideration but cannot be
returned.

9. North Carolina Libraries is not copyrighted. Copyright rests
with the author. Upon receipt, a manuscript will be acknowl-
edged by the editor. Following review of a manuscript by at
least two jurors, a decision will be communicated to the wri-
ter. A definite publication date cannot be given since any
incoming manuscript will be added to a manuscript bank
from which articles are selected for each issue.

Issue deadlines are February 10, May 10, August 10, and
November 10.







Selective Bibliography on
Library Performance Measures

Cynthia R. Levine

This selective bibliography is designed to provide a sampling of the vast literature on measures of library effectiveness. This broad topic includes writings on performance measures, output measures, library effectiveness, cost-effectiveness, and library statistics. The subject is closely related to library goals and objectives, against which library effectiveness is often measured. Because of the wide-ranging nature of the topic, a comprehensive bibliography is not feasible. I have chosen to concentrate on the reasons for measuring library effectiveness, specific ways in which it can be done, and how measures have been used in particular types of libraries and for particular services. Note that these categories are not mutually exclusive; thus many of the writings can fall in more than one area. I have also chosen to restrict the bibliography to relatively contemporary writings. With few exceptions, the items included were published in the 1980s.

Review Articles

These two recent review articles provide introductions to the research on performance measurement, showing the development of the topic over time. For additional information on earlier research, see Evans et al. (1972) listed in the "Methods of Analysis" section of this bibliography.

Deborah L. Goodall, "Performance Measurement: A Historical Perspective." Journal of Librarianship 20 (April 1988): 128-45.

Nancy A. Van House, "Output Measures in Libraries." Library Trends 38 (Fall 1989): 268-97.

Using Performance Measures for
Management Decisions

Performance measures are not made in a vacuum. The writings by Blagden (1980), DuMont (1980), and Orr (1973) discuss the rationale for performance measurement by showing how measures may be used by library managers to assist in making decisions and in justifying those decisions. The articles by Allen (1985), Christensen (1988), and Hernon (1989) discuss the collection and uses of statistics. Young (1989) gives an overview of library statistics compiled by federal and state governments as well as by library associations and organizations.

Cynthia R. Levine is Reference Librarian for North Carolina State University Libraries in Raleigh.

General Introductions

John Blagden, Do We Really Need Libraries? New York: Clive Bingley, 1980.

Rosemary Ruhig DuMont, "A Conceptual Basis for Library Effectiveness." College & Research Libraries (March 1980): 103-11.

Stuart Hannabus, "The Importance of Performance Measures." Library Review (Winter 1987): 248-53.

R. H. Orr, "Measuring the Goodness of Library Services: A General Framework for Considering Quantitative Measures." Journal of Documentation 29 (September 1973): 315-32.

Library Statistics

Geoffrey G. Allen, "The Management Use of Library Statistics." IFLA Journal 11 (1985): 211-17.

John O. Christensen, "Use of Statistics by Librarians." Journal of Library Administration 9, no. 2 (1988): 85-90.

Martin M. Cummings, "Cost Analysis: Methods and Realities." Library Administration & Management 3 (Fall 1989): 181-83.

Peter Hernon, "Research and the Use of Statistics for Library Decision Making." Library Administration & Management 3 (Fall 1989): 176-80.

Peter R. Young, "U.S. Library Statistics." Library Administration & Management (Fall 1989): 170-75.

Methods of Analysis

This section covers specific techniques that have been used to measure library effectiveness. The "Overviews" include general discussions of a variety of measures. The "Specific Measures" section lists studies of specific ways to measure availability of materials and the degree to which users are able to locate and gain access to them and to satisfy their information needs. Note that the articles by D'Elia (1985 and 1988) and Van House (1988) were written in response to one another and, by reading them in sequence, you can follow the debate on the usefulness of a particular measure called "fill rates."

Overviews

Rosemary Ruhig Du Mont, and Paul F. Du Mont, "Measuring Library Effectiveness: A Review and Assessment." Advances in Librarianship 9 (1979): 103-41.

Edward Evans, Harold Borko, and Patricia Ferguson, "Review of Criteria Used to Measure Library Effectiveness." Bulletin of the Medical Library Association 60 (January 1972): 102-10.

Philip M. Morse, Library Effectiveness: A Systems Approach. Cambridge, Mass.: MIT Press, 1968.

F. W. Lancaster, If You Want to Evaluate Your Library. Champaign, Ill.: Graduate School of Library and Information Science, University of Illinois, 1988.

G. Travis White, "Quantitative Measures of Library Effectiveness." Journal of Academic Librarianship 3 (July 1977): 128-36.

Specific Measures

Thompson R. Cummins, "Demand Analysis: Inputs, Outputs, Outcomes, and Productivity." Public Libraries 27 (Spring 1988): 10-13.

George D'Elia, "Materials Availability Fill Rates: Useful Measures of Library Performance?" Public Libraries 24 (Fall 1985): 106-10.

George D'Elia, "Materials Availability Fill Rates: Additional Data Addressing the Question of the Usefulness of the Measures." Public Libraries 27 (Spring 1988): 15-23.

George D'Elia, "A Response to Van House." Public Libraries 27 (Spring 1988): 28-31.

George D'Elia and Sandra Walsh, "User Satisfaction With Library Service: A Measure of Public Library Performance." Library Quarterly 53 (April 1983): 109-33.

Frederick G. Kilgour, "Toward 100 Percent Availability." Library Journal (November 1989): 50-53.

D. H. Revill, "'Availability' as a Performance Measure for Academic Libraries." Journal of Librarianship 19 (January 1987): 14-30.

Gene K. Rinkel, and Patricia McCandless. "Application of a Methodology Analyzing User Frustration." College & Research Libraries (January 1983): 29-37.

Tefko Saracevic, W. M. Shaw, and P. B. Kantor. "Causes and Dynamics of User Frustration in an Academic Library." College & Research Libraries 38 (January 1977): 7-18.

Nancy A. Van House, "In Defense of Fill Rates." Public Libraries 27 (Spring 1988): 25-27.

Nancy A. Van House, "A Response to D'Elia." Public Libraries 27 (Spring 1988): 32.

Public Libraries

Much of the work on output measures has
focused on public libraries. Several manuals have
been developed to aid public libraries, and these
have inspired much of the discussion on the
general topic of output measures. Lynch (1983)
compares two of these publications, Performance
Measures for Public Libraries (1973) and Output
Measures for Public Libraries (1982). A second
edition of Output Measures for Public Libraries
was published in 1987. Childers and Van House
(1989) show the multifaceted nature of library
effectiveness by identifying sixty-one distinct indi-
cators that can be classed into eight separate
dimensions. They point out that procedures have
not been developed to measure many of the most important of these indicators.

Manuals

Ernest DeProspo, et al. Performance Measures for
Public Libraries. Chicago: American Library
Association, 1973.

Nancy A. Van House, et al. Output Measures for
Public Libraries: A Manual of Standard-
ized Procedures. 2d ed. Chicago: American
Library Association, 1987.

Douglas Zweizig, and Eleanor Jo Rodger. Output
Measures for Public Libraries. Chicago:
American Library Association, 1982.

Discussion

Thomas Childers, and Nancy Van House. "The Grail of Goodness: The Effective Public Library." Library Journal 114 (October 1, 1989): 44-49.

Mary Jo Lynch, "Measurement of Public Library Activity: The Search for Practical Methods." Wilson Library Bulletin (January 1983): 388-93.

Charles R. McClure, et al., "Output Measures: Myths, Realities, and Prospects." Public Libraries (Summer 1986): 49-52.

Jane Robbins, and Douglas Zweizig. Are We There Yet? Evaluating Library Collections, Reference Services, Programs, and Personnel. Madison, Wis.: School of Library and Information Studies, University of Wisconsin, 1988.

Terry L. Weech, "Validity and Comparability of Public Library Data: A Commentary on the Output Measures for Public Libraries." Public Library Quarterly 8 (1988): 7-18.

Academic and Research Libraries

Kantor (1984) provides academic libraries with a practical manual demonstrating a series of measures appropriate for evaluating academic and research libraries. A new manual by Van House et al. was published in summer 1990 and is discussed in Tiefel (1989). The article by McClure shows one of the difficulties in conducting these measures: skepticism on the part of library staff regarding the validity and uses of performance measures.

Manuals

Paul B. Kantor, Objective Performance Measures
for Academic and Research Libraries.
Washington, D.C.: Association of Research
Libraries, 1984.

Nancy A. Van House, et al. Measuring Academic
Library Performance: A Practical Ap-
proach. Chicago: American Library Associa-
tion, 1990.

Discussion

Mary J. Cronin, Performance Measurement for
Public Services in Academic and Research
Libraries. Washington, D.C.: Association of
Research Libraries, 1985.

Charles R. McClure, "A View from the Trenches: Costing and Performance Measures for Academic Library Services." College & Research Libraries 47 (July 1986): 323-36.

Virginia Tiefel, "Output or Performance Measures: The Making of a Manual." College & Research Libraries News 50 (June 1989): 475-78.

School Libraries

Evelyn H. Daniel, "Performance Measures for School Librarians: Complexities and Potential." Advances in Librarianship 6 (1976): 1-51.

Special Libraries

The following article stresses the importance of evaluating corporate libraries and recommends modifying public library measures for this purpose.

Charles R. McClure, and Betsy Reifsnyder. "Performance Measures for Corporate Information Centers." Special Libraries 75 (July 1984): 193-204.

Reference Services

Evaluating reference services produces special challenges because many aspects of reference are difficult or inappropriate to quantify. These issues are discussed in the sources listed below.

Peter Hernon, "Utility Measures, Not Performance Measures, for Library Reference Service?" RQ 26 (Summer 1987): 449-59.

Peter Hernon, and Charles R. McClure. Unobtrusive Testing and Library Reference Service. Norwood, N.J.: Ablex, 1987.

Bill Katz, and Ruth A. Fraley. Evaluation of Reference Services. New York: Haworth Press, 1984. Also published as The Reference Librarian 11 (Fall/Winter 1984).

Ronald R. Powell, "Reference Effectiveness: A Review of Research." Library and Information Science Research (1984): 4-19.

Interlibrary Loan

Thomas J. Waldhart, "Performance Evaluation of Interlibrary Loan in the United States: A Review of Research." Library & Information Science Research 7 (1985): 313-31.

Cataloging

Measurement in the area of cataloging has focused on two areas, cataloging costs and the relationship between quality and quantity. The best introduction to this topic is Mandel (1988).

George Harris, "Historic Cataloging Costs." Library Quarterly 59 (January 1989): 1-21.

Carol A. Mandel, "Trade-offs: Quantifying Quality in Library Technical Services." Journal of Academic Librarianship 14 (September 1988): 214-20.

Richard Reeb, "A Quantitative Method for Evaluating the Quality of Cataloging." Cataloging and Classification Quarterly 5 (Winter 1984): 21-26.






POINT

Performance Measures: The Pursuit
of Excellence and Accountability

Jerry A. Thrasher

Did the computer system maintain your
expected response time during the performance
test? Which airline has the best on-time arrival
record? Which stock has the best earnings ratio?
Performance measures are used universally to make decisions and evaluations. This is true in one's personal and business life. Why should it be any different in public institutions like libraries?

If you don't have goals or specific objectives, how do you know if you have accomplished the job or if you have even gone in the right direction? And if you don't have performance evaluations, how do you know you are doing a good job? Performance measures are an excellent tool to determine how you are doing.

It is also important to remember that we are all accountable to someone. We are hired to perform a particular job and to do that job satisfactorily or better. How do we know when we are performing well? When our co-workers notice, when our boss tells us, or when we can prove it? All are important, but the latter gives substance to the former and is helpful to both the supervisor and the employee.

Acceptable measures should be explored and
tested within your library. Although they may not
have been in writing, staff performance measures
have always existed. How long should it take to
shelve a full book truck in the adult nonfiction
collection? How long does it take to catalog and
process a book truck of new best sellers? Such
measures go a long way in improving performance
and letting staff know what is expected.

Benchmarks can be established based on experience and through a process of joint exploration. Having realistic performance measures is far better than relying on a supervisor's whim. If performance measures do not exist, both the employee and the supervisor should work together as a team to establish them. Supervisors should keep in mind that the excellent employee who has been doing this job for five years will have a different performance level than an employee who has just been hired; the new employee's level of performance will be lower and may never reach the current level of expectation.

Jerry A. Thrasher, former North Carolina Library Association SELA Representative, is the Library Director for the Cumberland County Public Library & Information Center in Fayetteville.

At another level, the process of developing
performance measures can also help the super-
visor justify requests for additional staff, equip-
ment and other resources. A manager needs
reliable information to justify budget requests to
help staff serve their library constituency better. I
believe that the more quantified that information
is, the greater the likelihood of obtaining increased
funding. If funding is not forthcoming, then the
information is also available to reevaluate existing
functions or services that need to be modified or
dropped to live within your approved budget. If
you can get increased performance from your
staff and increased funding from your host organ-
ization without some form of performance or
workload measure, more power to you. I would
like to know what you are doing.

If you feel your employees can offer the library and their community more, or you are not getting the level of funding you think you should, performance and workload measures may be able to help. But they will help only if they are part of a cooperative effort between staff and management to reach the library's mission and goals.

In any case, pursuing the process will gener-
ate important data to demonstrate that you are
doing a good job with the resources allocated to
the library. The pursuit of meaningful perform-
ance measures is also the pursuit of excellence
and accountability.







COUNTERPOINT

Performance Measures Can't
Quantify Quality

Harry Tuchmayer

Do performance measures really work or are
they just another obstacle dreamed up by admin-
istrators to make your life more difficult? After
all, you were hired to do a good job in a profes-
sional manner, so why does your boss insist on holding you and your entire department up to some abstract standard that is barely attainable? Sound familiar? It should, because it speaks to the underlying problems with performance measures: mistrust and misunderstanding.

Staff, whether professional or support, fear standards. Now don't get me wrong, that doesn't mean that they aren't interested in doing a good job. They are! They just know that the "real reason" we set standards is to document poor performance, not to reward good performance. So what exactly are administrators really after when they attempt to measure performance? Are they setting realistic objectives for each department for the coming year? Are they attempting to document performance of individual employees for the purpose of evaluation? Are they really just measuring the level of activity in the library in order to justify next year's budget request? Your answers to these questions have a lot to do with how receptive you are to performance measures. The fact of the matter is, staff mistrust standards because they don't understand how they will be used; they fear output measures because they don't understand why such statistics are collected.

Does this mean that performance measures are a waste of time? Perhaps not. Knowing how many carts can be shelved in an hour, how many books can be cataloged or processed in a month, and how many bibliographies should be produced this year can help supervisors and employees set appropriate goals. However, and this is the difficult part, they need to be realistic and flexible benchmarks that encourage performance rather than create fear in the workplace. All too often administrators establish measures in a vacuum, handing down goals and objectives as if they were dictated levels of achievement that each department is expected to meet. Instead of involving the individual employee in the process of measuring output for the purposes of establishing objectives, the instrument and its results are handed over to the immediate supervisor as a fait accompli ready to be adopted and acted upon. Performance measures must be developed and standards set by administrators, supervisors, and staff if they are to have any value to the organization. Otherwise, you run the risk of creating an environment where individuals do what's expected of them, and no more. Thus, instead of setting standards for excellence you've created a cop-out for mediocrity.

Harry Tuchmayer, the editor of Point/Counterpoint, is Headquarters Librarian for the New Hanover County Public Library in Wilmington.

All this is perhaps easier said than done.
Structuring output measures that work takes
time. They require a commitment on the part of
everyone involved to honestly evaluate what can
be done and how it can be achieved. It takes a
willingness on the part of administrators to accept
staff input and an acceptance on the part of staff
that administrators really do have their best
interests at heart. In the end, it takes a certain
degree of trust that the objective is to improve
service, not to penalize staff. Only in an environ-
ment of mutual trust and understanding can we
even begin to address why we need performance
measures.

So why do we need performance measures?
Is it to determine which is the best library in the
state or to encourage each of us to make our
libraries even better than they already are? Is it to
give administrators something to do in their office,
or is it to help create a process of communication
between administrators and their staff? Is it to
prove something to the rest of the library world,
or to prove something to ourselves?

We don't really need performance measures. Instead we should be working on ways to encourage each and every one of us to care enough about the job we do to do it well. You can't mandate exceptional performance, but you can achieve it with a lot of hard work and a commitment to excellence. You don't get that commitment with good performance measures; you get it with good management skills.








Library Research in North Carolina

Jinnie Y. Davis, editor

One source of library scholarship in North Carolina is the master's paper required at the three accredited library schools: the University of North Carolina at Chapel Hill, North Carolina Central University, and the University of North Carolina at Greensboro. While the master's paper generally does not approach the doctoral dissertation in the level of research skill required or the scope of the topic, completing such a project does instill in the student an understanding of what it means to conceive, research, execute, and document a research project.

Even a quick glance at the lists of master's papers written at these three institutions reveals the wealth of potential topics for further research. A large number of these papers also use libraries in North Carolina as their laboratories for study. Once written, however, most master's papers languish unpublicized at their library schools.

The purpose of this column is to draw attention to master's papers that are worthy of wider attention: papers that were identified by the library faculty at UNC-Chapel Hill and at NCCU as meritorious pieces of scholarship. At UNC-Greensboro, Dr. Marilyn Miller reports that the faculty is restructuring the master's paper process. By requiring a research methodologies course that is closely tied to the writing of a master's paper, they hope to improve the research skills of their students and the quality of the papers produced.


UNC-Chapel Hill's School of Information and Library Science has recognized outstanding master's papers since 1977, when, by a gift of the Rockwell Fund, Dr. Edward Holley was able to establish the Dean's Achievement Award for the best master's paper in any one year. Winners of the award receive a $100 check and formal recognition in the University's commencement program. Members of the library faculty nominate notable papers, and a faculty committee selects the winners. In the eighties, the award was split into two categories, with winners designated for both a best "general" paper (usually on some area of traditional librarianship) and a best "technical" paper (usually on a topic in the area of information science). Although many M.L.S. graduates have published works based on their master's papers, no comprehensive attempt has been made to keep track of all subsequent publications.

Winners of the Dean's Achievement Award over the last five years, and the titles of their papers, are listed below.

1990 Nan Marie McMurray. Sufficiently
Ornate: Librarians and Library
Architecture, 1876-1900.

Andrea Louise Rohrbacher. Monitoring
Adverse Drug Reaction Reports Using
Commercial Medical Databases.





1989 Joel Brett Sutton. MIRA: A Prolog-Based
System for Musical Information
Retrieval and Analysis.
Paul Staley Williford. Study in Gray:
Information Needs among Older
Adults in Shelby County, Tenn.

1988 Leslie Carol McCall. Organization of
Musical Pedagogical Materials.
Daniel Gordon Wheeler. Investments in
Brittle Books.

1987 S. Diane Shaw. A Study of the Collabora-
tion between the Scholar Erasmus of
Rotterdam and His Printer Froben at
Basel during the Years 1514 to 1527.

Deborah K. Barreau. Automated
Reserves System for a Special Library.

1986 Susan Elizabeth Bello. A History of Plans
for Cooperative Preservation Efforts
Involving Academic Libraries.

James Timothy Shaw. Personal Narra-
tives as Sources for the History of the
Spanish Civil War.

David Anthony Day. An Automated Bib-
liographic System for 19th-Century
Opera Librettos.

At NCCU's School of Library and Information Sciences, Dean Benjamin Speller reports that the faculty has awarded an Outstanding Research Award since 1976. Winning master's papers are selected for the significance of their topic and the execution of the research methodology. Although no financial stipend is attached, the winners are recognized at a university-wide award day in the spring. NCCU does track publications by its graduates and reports twenty-five articles or papers published in the professional literature by its M.L.S. graduates since the late seventies.

Award-winning master's papers at NCCU over the last five years were:

1990 Christopher D. Forney. The Acquired
Immune Deficiency Syndrome: A
Bibliometric Analysis: 1980-1984.

1989 Elizabeth Janet Gardner. A Study on the
Existence of Library Materials Recon-
sideration Policies and Procedures in
Public School Systems of North
Carolina.

1988 Margaret P. Brill. Government Docu-
ments as Bibliographic References in
the Periodical Literature of Inter-
national Relations: A Citation
Analysis.

1987 Desire E. Volkwijn. Censorship in Schools: L. B. Woods Updated.

1986 Susan G. Kundin. A Comparison of the Treatment of Adolescent 'Problem-Concerns' in Formula Romance Fiction and Contemporary Realistic Fiction.

All the works listed above are available via
interlibrary loan. Readers interested in the topics
or in seeing examples of good scholarship in
North Carolina will want to take a look at them.


Subscription Order

Please place mailing label
from your issue here.

North Carolina Libraries is published four times a year by the North Carolina Library Association. Subscription: $32 per year; $50 foreign countries. Single copy $10. Address new subscriptions, renewals, and related correspondence to Frances B. Bradburn, editor, North Carolina Libraries, Joyner Library, East Carolina University, Greenville, NC 27858, or call (919) 757-6076. (For membership information, see address label on journal.)






"... keeps getting better."*

Ferris Beach

The story of a time and a place, Ferris Beach tells of a young Southern girl's coming-of-age in the 70s. It tells of a love that bridges social classes, of confronting Southern propriety, of courting the forbidden. It is Jill McCorkle at her soaring best.

"A marvelous follow-up to McCorkle's acclaimed Tending to Virginia. . . . Despite a few occasions of real tragedy, what predominates is McCorkle's deft comic sense, her keen ear for dialog and eye for detail."
Library Journal (starred review)

"A commendable balance of tragedy and mirth . . . the full texture of a child's wonder and terror is preserved." *Booklist

Algonquin Books of Chapel Hill
a division of Workman Publishing Co., Inc.
307 West Weaver Street
Chapel Hill, North Carolina 27510









North Carolina Books

Robert G. Anthony, Jr., Compiler

Carolyn Sakowski. Touring the Western North
Carolina Backroads. Winston-Salem, N.C.: John
F. Blair, 1990. 305 pp. $14.95. ISBN 0-89587-077-0
(Paper).

Ironically, the author's desire to promote an appreciation of the isolated beauty and the quaint communities that have survived highway development in western North Carolina will result in increased traffic by readers of her inspirational guide. Sakowski's great contribution to the bookshelf, if not the back seat of the car, is: 1) latching onto a rich concept; 2) applying an eye for delicious anecdote; and 3) delivering crisp prose. The problems with the book hardly tip the scales against an enthusiastic recommendation.

The tours require setting aside a whole day for each of twenty-one excursions, which range in length from twenty-one to 100 miles, many of which whiz by without authorial comment. The guide is best read first in an armchair, referring to simplified maps and noting opportunities to hike, picnic, play, and gape. Punctuated by explicit road directions, a fluid succession of stories plays on important motifs: the mystery of nature as seen through Cherokee legends (drawn mostly from James Mooney); the exploitation of water, mineral, timber, and climate; and the horror and charm of pioneer ambition.

The better chapters have themes rather than lists of sites connected by roads. The tour of Haywood and Madison counties paints a picture of the old Buncombe Turnpike that lives up to the legend of "Bloody Madison," including century-old hearsay about a drover who tucked the bludgeoned body he found in his hotel room into his bed as a decoy to escape a murderous innkeeper. Sakowski makes effective use of quotations, adding to Madison's ambience with the WPA North Carolina guide's depiction of Marshall: "one mile long, one street wide, and sky high."

Sakowski ranges wide. In Marble, she re-creates the cutter's experience; in Swain County, she reveals how Horace Kephart scientifically pinpointed his retreat to desolate wilderness; approaching Cullowhee, she stoops to reconstruct the life of a man who made a fortune transplanting goat glands into men desirous of potency. She popularizes remote material. In Avery County, she explains the ardor of eighteenth-century botanists by comparing competition among nations for unusual flora to the space race today.

The biggest disappointment is that she does not provide insight into the lay of the land (which is what one mostly sees) and that she misses opportunities to dramatize current ecological concerns. There is no mention of the effect of acid rain on the Fraser firs on Mount Mitchell, which has, among other things, created an otherworldly scene.

The text refers to further reading, such as
Jules Verne's novel set at Table Rock, but the
bibliography is hastily assembled. The Trail of
Tears story intensifies a few chapters, but the
bibliography does not include John Ehle's work of
that name. An opportunity to enlarge upon geog-
raphy is missed by not including Harry Moore's A
Roadside Guide to the Geology of the Great Smoky
Mountains. William Bartram is quoted, but his
Travels is not cited. The bibliography is not
organized thematically to present a usable read-
ing list. An index and a list of helpful agencies
are just adequate. The book's format is attractive;
a wide left-hand margin provides space for sub-
headings and photographs, but the photographs
are horribly small and dark.

Sakowski does not prove herself to be a pain-
staking scholar in this effort; however, she is a
reliable traveler, a homegrown appreciator, and a
wonderful storyteller. The bottom line is, she has
produced a unique resource that will be de-
manded and cherished by residents of and visitors
to the region.

Rob Neufeld, Asheville-Buncombe Library System

Catherine W. Bishir, Charlotte V. Brown, Carl R.
Lounsbury, and Ernest H. Wood III. Architects
and Builders in North Carolina: A History of the
Practice of Building. Chapel Hill: University of
North Carolina Press, 1990. 540 pp. $37.50. ISBN
0-8078-1898-4.







Histories of the practice of building, unlike histories of architecture, are relatively uncommon. Though the two genres are related, there are important differences between them. Histories of architecture tend to emphasize styles, significant buildings, and important architects. Histories of building practice are less concerned with what buildings look like than with how they were built.

Perhaps the authors of Architects and Build-
ers in North Carolina best summarize their book:

oThis book is about the people who built North CarolinaTs
architecture. It describes how the practice of building
changed from traditional craft to complex industry.
Although there have been many studies of segments of
the history of American building practice, this is the first
work to look at the builders as a whole"artisan and
architect, contractor and manufacturer, slave and free,
rural and urban"and to trace the history of building
practice from early settlement to the present.... And,
although it suggests the outlines of the larger national
picture of building practice, this is a story rooted in a
single place " North Carolina" and a story that emerges
directly from the personal sagas of hundreds of indivi-
duals laboring at thousands of building sites across this
long-rural state.� (Introduction.)

With copious quotations from contemporary
documents, the authors of this important book
present a comprehensive account of how buildings
of all types came to be built in North Carolina. The
relationships between client and builder, the effect
of technology and improvements in transporta-
tion on the availability and cost of building mate-
rials, the skills, pay, and working conditions of
white and black artisans are covered chrono-
logically from a beginning chapter on construction
practices of the seventeenth century to the final
chapter on builders and architectural firms of the
1980s.

The scholarly apparatus is impressive. Fifty-
six pages of notes and a twenty-nine page bibliog-
raphy reveal the large number of manuscript and
printed sources consulted. Extensive research
was conducted in manuscript collections at Duke
University, the University of North Carolina at
Chapel Hill, and the North Carolina Division of
Archives and History.

Catherine W. Bishir, director of the North
Carolina Architecture Project of the Historic
Preservation Foundation of North Carolina, has
written a number of articles on North Carolina
topics. Her book, North Carolina Architecture,
will be published in November 1990. Charlotte V.
Brown directs the visual arts program at North
Carolina State University. Carl Lounsbury, an
architectural historian with the Colonial Williams-
burg Foundation, wrote his dissertation on
changes in the building process in North Carolina in the nineteenth century. Ernest H. Wood writes
on architectural subjects for Southern Living. Re-
search assistance was provided by J. Marshall
Bullock and William B. Bushong.

The book is well designed, with legible type, a
conservative page layout, and sufficient margins
for rebinding, should that become necessary.
There are 155 illustrations, somewhat gray in
reproduction but adequate for the purposes of
this book.

Architects and Builders in North Carolina
will be particularly valuable for architectural
historians and for social and economic historians,
both for specific facts and for its broad overview
of building practice. General readers and students
interested in architecture, the built environment,
and North Carolina history will find it useful as
well. It is strongly recommended for academic
and public libraries.

Philip Rees, University of North Carolina at Chapel Hill

Gloree Rogers. Love, or a Reasonable Facsimile.
Durham: Carolina Wren Press, 1989. 160 pp. $7.00.
ISBN 0-932112-27-7 (paper).

Gloree Rogers's first novel is autobiographical,
telling the story of a black girl growing up in North
Carolina poor, handicapped, and trapped in
damaging relationships.

The second of six children, Gloree is born in
Bladen County with multiple birth defects, lacking
pubic bone and vaginal opening, with her bladder
outside her body, and with deformed legs. After
numerous operations at Duke Hospital, the child
learns to walk at age seven and is able to begin
school, where the children make fun of her.

When Gloree is nine, her mother moves the
family to Durham, where she lives with a succes-
sion of boy friends. Some of these men pity the
little girl and are kind, but as she nears adoles-
cence the men provoke her mother to beat her,
and her life is constantly filled with violence. The
neighborhood is no safer, as old Mr. Billy trades
sacks of candy for sexual favors. "Ya ain't gonna tell nobody, is ya?"

After high school, Gloree moves to New York
to work as a live-in maid. She becomes pregnant
just months after having an operation to rebuild
her vagina. Abandoned and jobless, she gives birth
to a baby girl, then finds she has signed it over for
adoption without realizing it. Recovering the baby,
she moves back to Durham where she marries,
divorces, has a series of painful relationships with
abusive men, marries and divorces again, and
continues the dismal cycle.





"I share the tragedies of my life in the hope that no living being will have to repeat these experiences except vicariously through the written word," writes Gloree Rogers. Yet the tragedies of cruelty, poverty, and ignorance are not ones that are easily escaped. Rogers's story rings true, and evokes in the reader strong emotions of outrage, pity, disgust, and hopelessness. The writing, however, is clumsy in places, with stiff dialogue, inconsistency of style, and a lack of character development. In spite of these flaws, the book received first place in the 1988 Carolina Wren-Obsidian II minority book contest for North Carolina.

This book is not about glorious achievements. It is about surviving, getting by, and settling, not for love, but a reasonable facsimile. It will be appropriate for academic collections in women's or black studies, and, in spite of the explicit sexual detail, for public libraries.
Lisa Dalton, Rockingham County Public Library

Chris Florance. Up From Mount Misery: The
Blossoming of North Carolina's Sandhills. Ashe-
boro, N.C.: Down Home Press, 1990. 211 pp. $19.95.
ISBN 0-9624255-3-2.

On the founding of Southern Pines, N.C., circa
1884, newspaperman and former clerk of the
North Carolina House of Representatives John D.
Cameron described the quality of the land of the
Sandhills in these words:

There is no more barren or poverty-stricken belt in the State except Mount Misery near Wilmington, yet to this region Mr. Patrick has given the name of Southern Pines, a place where a pea vine will not grow and a grapevine cannot sprout. A sand bank where even the thinly scattered pine trees are stunted, where the wire grass stands in meager clumps, few and far between, and the white sand is marked with drifts of pine straw washed together by the summer floods that are not swallowed up by the thirsty desert.

This description was familiar to author Chris Florance, a former history teacher and ornamental horticulturist, who grew up in the area and graduated from Ellerbe High School in 1926. From personal knowledge as well as primary and secondary resources, Mrs. Florance tells the story of the arrival of a few wealthy and well-educated, mostly young, northerners in the early 1900s who saw both opportunity and potential in parts of an eight-county area called the Sandhills.

These visionary men bought and cleared land, built fine homes, planted crops, and started peach orchards. Because of their influence and minor success in their ventures, friends were encouraged to come. Community schools were established for their poorer neighbors, a legacy that reached beyond the area and that was more successful than their agricultural pursuits. The dreams and struggles of these men, shared eventually by the native population, gradually saw the blossoming of an area thought to be forever doomed because of its poor soil.

The book is divided into four parts, three of
which bear the names of Roger Alden Derby,
Morris Randolph Mitchell, and Frederick Taylor
Gates, men who in the 1920s came to live and
work in the Sandhills. All three were from prominent families, and as individuals they had been friends of or relatives of such persons as Franklin D.
Roosevelt, Grover Cleveland, Teddy Roosevelt,
Walter Hines Page, Raphael Pumpelly, Dr. James
Albert Broadus, Richard Loverling, and John D.
Rockefeller.

Drawing heavily upon the unpublished mem-
oirs (1935) of Roger Alden Derby, Mrs. Florance
creates an interesting and at times intimate
account of these men and their families, their
relationships with and contributions to the people
of the area, and their successes and failures as
entrepreneurs. Interwoven into this historical
account of the "County Families," the term used to
describe this colony of northerners, are various
personal stories and reflections that could only be
put in proper context by a writer who knows her
subject. One such account is that of the small
farm family, the fictional Chases, where one can
experience the life of a poor but proud Sandhills
family of the early 1900s.

The last part of the book, "Sandhills Memorabilia," includes poems by Roger Derby, Clyde L. Davis, and Raphael W. Pumpelly II, and letters
from prominent persons such as Walter Hines
Page and Ethel Roosevelt Derby. These writings all
deal with the physical characteristics of the area
itself or with life in the region.

Complementing the text are well-chosen pic-
tures of persons and places discussed. The style of
the author makes the book very readable, and
public and academic libraries will want to add it
to their collections. The paper board binding,
however, will not survive many circulations. The
book should have much appeal to lovers of the
Sandhills and would make an excellent gift for
resident or visitor. Mrs. Florance has also authored the award-winning book, Carolina Home Gardener (UNC Press, 1976), now out of print.

Gary Fenton Barefoot, Mount Olive College







Daniel W. Patterson and Charles G. Zug III, eds.
Arts in Earnest: North Carolina Folklife. Dur-
ham: Duke University Press, 1990. 319 pp. $42.50.
ISBN 0-8223-0943-2 (cloth); $18.95, ISBN 0-8223-1021-X (paper).

"The exploration of folklore, then, is not an antiquarian pursuit; it leads directly into earnest intellectual, social, and human issues." (p. 3).

This quotation from the introduction to Arts
in Earnest gives the reader a hint of what is to
come. If, like this reader, you have considered
folklife studies merely the documenting of quaint
stories and customs from the past, you are in for a
surprise when you read this book. Mischief on the
factory floor, house design, tall tales told by frater-
nity boys, the chant of the tobacco auctioneer,
and the aesthetics of duck decoys are now fit
subjects for contemporary North Carolina folklor-
ists. All these topics are included in this volume,
as are more traditional subjects such as quilt-
making, storytelling, religion, and music.

All fifteen essays in Arts in Earnest are based
on both fieldwork and library research. In each
essay, the author attempts to go beyond simply
describing a story or a practice to an examination
of the meaning of the activity for the performer
and his or her community. Several of the essays,

particularly those on music, show the authors' training in other subject fields. The essay by
Thomas Carter and Thomas Sauber on the New
River Valley String Band may be difficult for those
who are unable to read music, but all of the other
essays are very accessible. Laurel Horton's article
on quilts in antebellum Rowan County is a model
of clarity, and the essay by Stephen Matchak on
wildfowl decoys succinctly covers the social and
economic history of the northeastern North Caro-
lina coast while discussing the decoy tradition of
the area. John Forrest's article complements Matchak's by revealing the aesthetics of decoys
and the relation of aesthetic achievement to status
among duck hunters. The book includes essays on
both black and white folk culture, and covers all
geographic regions of North Carolina.

The authors of the fifteen essays are former
students in the Curriculum in Folklore at the
University of North Carolina at Chapel Hill; the
editors are faculty members in the program. The
introduction by the editors provides an excellent
review of the history of folklife studies. Most of the
articles are illustrated, endnotes are included for
all, and there is an index.

This is a scholarly book that can function as
an introduction to modern folklife studies. It is
also a readable volume that will be enjoyed by
library patrons interested in North Caroliniana.


Recommended for academic libraries and larger
public and high school libraries.

Eileen McGrath, University of North Carolina at Chapel Hill

Reynolds Price. The Tongues of Angels. New
York: Atheneum, 1990. 192 pp. $17.95. ISBN
0-689-12093-1.

In his recent autobiographical work Clear
Pictures, Reynolds Price describes the flood of
early memories retrieved during and following
hypnotherapy sessions which were part of his
treatment for spinal cancer. In Price's eighth
novel, The Tongues of Angels, the reader feels
that the author is sharing the memories and
feelings of his younger self more directly than in
any of his previous fiction.

The novel's narrator and protagonist, Bridge Boatner, is the artist that Price realized at a young age that he would never be. Bridge has a great deal to say about his philosophy of painting in this novel, and many of his comments can be equally well applied to the art of the novelist. Still closer to home, Bridge, like Price, lost his father at age twenty-one and was left to be "the man at bat" in his family. This is not a new theme in Price's work: Milo Mustian of A Generous Man found himself in the same position, as did Kate Vaiden. Here, however, Price creates a young man with talents similar to his own, sets him down in his own lifetime, and gives him the same summer job that he held himself one year in the early 1950s: counselor at a boys' camp in the North Carolina mountains. As Bridge moves through that summer consciously seeking to bury his father and the memories of his own helplessness in the face of his father's death pains, any reader with similar memories is likely to feel that Price has written more autobiography than fiction here. Beyond these parallels, Price writes in Clear Pictures that the events in Bridge's summer are completely fictional.

Bridge's duties at Camp Juniper included teaching art classes, writing and editing the camp newsletter, and tending a cabin full of lively ten- to twelve-year-old boys. Two important things happened to him that summer: he completed his first significant painting, "The Smoky Mountains as the Meaning of Things," and he made a friend and lost one in fourteen-year-old Raphael Noren, a veteran camper with extraordinary talents as an Indian dancer, and a tragic history.

"I'm as peaceful a man as you're likely to meet in America now," an older Bridge begins the narrative, "but this is about a death I may have caused." This opening sentence ensures the reader's attention to the very end of the story, but also sets up a letdown once he gets there. Bridge does not cause anyone's death by any stretch of the imagination. He fails to anticipate and direct an unforeseeable and uncontrollable event in another person's life, and with the pride of youth prefers to call himself guilty rather than helpless. Watchfulness, attention, looking at things and people in loving detail form the basis of Bridge Boatner's art as well as his faith; and when his young friend slips out of his sight, he faults himself.

Guilt, then, is a major theme of this novel, and it is echoed in Bridge's feelings about the Indian lore which forms much of the basis of camp life. The title suggests redemption, and signs of redemption abound throughout the book. Bridge reminds us that angels are messengers bringing news, and that Jesus taught that we are to watch for the messages. Angels are a favorite subject for his drawings and paintings, and he tells us that his first fame will come from a series of angel studies. He asks Rafe to pose for him, remembering with a shiver that Raphael was an archangel. He describes the boy Rafe as having "other-worldly looks," a "credible Angel Gabriel" who "enters a real room ten-foot square and greets the girl rising to meet him in the dim far corner, 'Hail Mary, full of grace!'" Rafe's message seems to come as he dances around the campfire, becoming the eagle he portrays as Bridge watches.

As much as he appreciates Rafe's gifts and message, Bridge has ambitions to be a messenger in his own right. Michelangelo and van Gogh, he says, "meant every picture as a forthright message, to change men's souls." He means the landscape he completes that summer in the same way, seeing a coded message that just barely eludes him in the rhythm of the mountainous panorama. It is while studying it that he feels he misses his chance to save Rafe.

One last allusion to angels comes in an almost
parenthetical bow by Price to Thomas Wolfe.
Bridge and a fellow counselor take a sort of
pilgrimage to the Wolfe home in Asheville on a day
off, and the impressionable Bridge nearly comes
to the rescue of a forlorn young unwed mother
who is in charge of the place.

Duke University professor Reynolds Price has written eight novels and several volumes of short stories over the last thirty years. He has created many memorable characters reacting to unusual, often overwhelming, circumstances in the midst of mundane surroundings. In this latest novel and in his autobiography, he has given his readers a great deal of himself. Recommended for school, public, and academic libraries.

Dorothy Hodder, New Hanover County Public Library

Other Publications of Interest

For students of the history of religion in the
Tar Heel state, George W. Paschal's History of
North Carolina Baptists is indispensable. This
two-volume work [published 1930 (Vol. 1) and
1955 (Vol. 2) ] provides a comprehensive exami-
nation of the Baptist presence in North Carolina,
from arrival in the late seventeenth century
through the mid-twentieth. By the Civil War, as
Paschal explains, Baptists associated with the
Baptist State Convention had become the largest
denomination in the state, and members of that
church have continued to play a major role in Tar
Heel religious life. Long out of print, Paschal's
study has recently been reprinted by Church
History Research and Archives (220 Graystone
Drive, Gallatin, Tenn. 30766) and is available for
$54 (set). Included in the reprint volumes (601
pp., 578 pp. hardback) are greatly expanded
indexes, with more than sixteen thousand refer-
ences to individuals, churches, and religion-
related subjects. A list of additional church history
and theology titles, some of which are offered at
discounts to libraries, may be requested from the
publisher.

Thomas Wolfe longed to be a playwright and
applied his genius to writing for the stage early in
his literary career. He is, however, best known for
his long autobiographical novels. Yet some critics
believe several of his short stories to be among his
best work. Some of Wolfe's short fiction initially
appeared in magazines and was later incorpor-
ated into his novels. In From Death to Morning, he
collected fourteen stories. Others were drawn
from his manuscripts and published posthumous-
ly. In The Complete Short Stories of Thomas
Wolfe, editor Francis E. Skipp has gathered fifty-
eight Wolfe stories, thirty-five not collected before
and one published for the first time. This volume,
first published in 1987, is now available in paper-
back (1989; Collier Books, Macmillan Publishing
Co., 866 Third Avenue, New York, N.Y. 10022; 621
pp.; $12.95; ISBN 0-02-04891-9).

The final volume in William R. Trotter's trilogy on the Civil War in North Carolina focuses on the conflict in the state's coastal region. In Ironclads
and Columbiads: The Coast, Trotter details the
struggle for control of strategic railroads and
canals, the sinking of the ironclad ram Albemarle,
the battle for Fort Fisher, and other activities along the coast, which was the scene of more
fighting than all other parts of the state combined.
For reviews of the two previous volumes, Silk
Flags and Cold Steel: The Piedmont (Vol. 1) and
Bushwhackers!: The Mountains (Vol. 2), see North
Carolina Libraries 47 (Summer 1989): 126-127
and (Winter 1989): 262-263, respectively. (Iron-
clads and Columbiads, Vol. 3; 1989; Piedmont Im-
pressions, P.O. Box 29364, Greensboro, N.C. 27429;
456 pp.; $19.95; ISBN 0-9293307-05-4; cloth.)

With Carolina Follies: A Nose-Tweaking
Look at Life in Our Two Great and Goofy States,
veteran Charlotte Observer reporter Lew Powell
offers incontrovertible proof that Foot-in-Mouth
Disease frequently victimizes Carolinians, and that
this behavior can be unintentionally hilarious or
simply hard to believe. Borrowing the idea from Esquire's "Dubious Achievements Awards," Powell since 1977 has annually published in the Observer a year-end review of absurdities and faux pas from the Carolinas, an area he labels "a satirist's paradise." In Carolina Follies, he has collected over two hundred of his favorite quotations and summaries of "screwball news," providing the reader a laugh-filled look at "the very best in foibles and foolishness." (1990; Down Home Press, P.O. Box 4126, Asheboro, N.C. 27204; 96 pp.; $6.95; ISBN 0-9624255-1-6; paper.)

North Carolina Giving:
The Directory of the State's Foundations

North Carolina Giving is the most complete, authoritative guide to the state's more than 700 private charitable and community foundations. It is a vital resource for nonprofit organizations and institutions, or for anyone seeking grants.

The eleventh in the series of short county
histories published by the Historical Publications
Section of the North Carolina Division of Archives
and History, Cumberland County: A Brief History
offers a concise but informative account of the
heritage of one of North Carolina's more historic
counties. Author Roy Parker, Jr., editor of the
Fayetteville Times, ranges widely, from economic
to social to political topics. Scottish Highlander
settlements; the development of Fayetteville as a
political, economic, and cultural center; the
destruction of an important Confederate arsenal;
and the establishment of Fort Bragg are but a few
Cumberland highlights included. (1990; Historical
Publications Section, 109 East Jones Street,
Raleigh, N.C. 27601-2807; 158 pp. $6.00, plus
$2.00 postage; ISBN 0-86526-243-8; paper.)

The Historical Publications Section has also
recently published Volume XII in its acclaimed
North Carolina Troops, 1861-1865: A Roster
series. This volume covers the Forty-ninth through
Fifty-second Regiments, North Carolina Troops,
Confederate infantry. In addition to the roster of
soldiers, compiler Weymouth T. Jordan, Jr., pro-
vides unit histories. (1990; Historical Publications,
109 East Jones Street, Raleigh, N.C. 27601-2807;
565 pp.; $27.00, plus $3.00 postage; ISBN 0-86526-
017-6 (Vol. XII), 0-86526-005-2 (series); cloth.)

North Carolina Giving provides all the information that is needed to easily identify appropriate funding sources. The directory is cross-referenced with indexes by county, areas of interest, and board members, saving you countless hours of research.

North Carolina Giving: The Directory of the State's Foundations, by Anita Gunn-Shirley. Published by Capital Consortium, Inc., 1990 edition. Order your copy of this limited edition today: $99.00 per copy; ISBN 0-9624910-0-4. Return orders to North Carolina Giving, Capital Consortium, P.O. Box 2918, Raleigh, North Carolina 27602; 919/833-4553.








NCLA Minutes

North Carolina Library Association
Minutes of the Executive Board
April 20, 1990

The Executive Board of the North Carolina Library Associa-
tion met Friday, April 20, 1990, at 9:00 a.m. in the Educational
Resources Building of Durham Technical Community College.
Prior to formally calling the meeting to order, President Barbara
Baker introduced Doris Anne Bradley, Chair of the Constitution,
Codes, and Handbook Revision Committee, who presented the
Review of Parliamentary Procedures which had been scheduled
for the January 25 meeting. Previously mailed to the Executive
Board were the NCLA Constitution and Bylaws, a description of
the Executive Board and its duties, and the NCLA calendar of
dates and deadlines during the biennium. These will become
part of the new edition of the NCLA Handbook.

President Baker called the meeting to order at 10:00 a.m.,
welcomed guests C. Betina Morris from the Dept. of Administra-
tion and Leonard Sherwin, who represents Friends of N. C.
Public Libraries, relayed apologies from Leland Park and
Howard McGinn, who could not attend, announced changes to
the agenda, and welcomed the Board to Durham Technical
College. Present at the meeting were Doris Anne Bradley, Martha
Fonville, Reneé Stiff, Laura Benson, Pat Siegfried, Susan Janney,
Jane Moore, Robert Gaines, Martha Ransley, Leonard Sherwin,
Steve Sumerford, Joanne Abel, David Gleim, Pat Langelier,
Johannah Sherrer, Frances Bradburn, Janet Freeman, Melanie
Collins, Nancy Ray, Karen Seawell, C.Betina Morris, Nancy Bates,
David Harrington, Michael LaCroix, Sylvia Sprinkle-Hamlin,
David Fergusson, Barbara Baker, and Amanda Bible.

Minutes of the January 24-25 meeting were corrected to
insert "bylaws" between "section" and "available," to include Art Weeks
as present, and to correct the spelling of Robert Gaines, Robert Reid, Bil
Stahl, and Martha Ransley. Minutes were approved as corrected.

Jane Moore, reporting for State Librarian Howard McGinn,
stated that the ACC psa videos had received an award from the
National Commission on Libraries and Information Science,
and that funds are still being solicited to cover the cost of the fall
series. Plans for the nine regional conferences that will precede
the Governor's Conference on Libraries were presented. LSCA
has been authorized by Congress for the next five years, but the
funding is unknown at this time.

Because of the state budget shortfall in revenue, the Dept.
of Cultural Resources is expected to return $3 million, but it will
not come from aid to public libraries. Eunice Drum is retiring
and two positions, Information Specialist and General Institu-
tional Consultant, are vacant. The State Library will have a
booth at ALA for recruitment and promotion of library services.
Library brochures and volunteers to staff the booth were
requested. Comments were made about the changes in ILL
service through In-WATS for community college and small
independent public libraries.

Treasurer Michael LaCroix's report showed $3,202.03 in the
checking account, $77,826.45 in Certificates of Deposit as of
March 31. January-March disbursements totaled $107,516.74
and all sections have credit balances. The audit report has been
received and is available for examination. NCLA is in good

financial shape. It was noted that any money from the NC
Humanities Council for the Books of America program, including
accrued interest, that is not spent on the program will have to
be returned.

Administrative Assistant Martha Fonville distributed SELA
brochures and bookmarks, announced that the membership
database is almost set up, so that mailing labels and membership
statistics will be available. She noted that there has been about
45% renewal of members since January and twenty new mem-
bers. A calendar of all meetings is being maintained to aid in
planning and to avoid conflicts. David Gleim asked if it would be
possible to add the Executive Board to all of the organizations
within NCLA to improve communication. This was approved by
consensus.

Frances Bradburn, Editor of North Carolina Libraries,
reported that the spring issue is scheduled to be mailed this
week. The publisher changed from off-set typesetting to desktop
publishing with this issue. The small savings from this change
was offset by the change to acid-free paper. A report of the
upcoming issues through Winter 1993 was distributed. A discus-
sion of why NCL did not receive a John Cotton Dana award
again this year, even though the substance of the publication is
good, followed. Possible format changes to improve the appear-
ance of the journal were suggested, but it was noted that any
changes would increase the cost, and it currently costs approxi-
mately $17.00 per member. If members feel that changes should
be made, this should be communicated to Nancy Fogarty,
Finance Committee Chair, who is working on the 1991-92 budget.

Committee Reports

Janet Freeman, Conference Committee Chair, presented a
report with three site proposals for the 1993 biennial confer-
ence: Raleigh, High Point, and Winston-Salem. After discussing
the merits of each site, the Executive Board voted to accept the
Winston-Salem proposal for October 19-22, 1993.

Guidelines for preparing NCLA bulk mail and a charge-back
schedule, prepared by Janet Freeman and Martha Fonville, were
presented. These will be included in the new Handbook. Doris
Anne Bradley announced, with credit to Martha Fonville, that
the new edition of the Handbook should be ready before the next
Executive Board meeting in July. The Committee is to meet
again on May 10. An addition will be a compilation and explana-
tion of all awards given by NCLA and the various sections. An
amendment to the Constitution is needed to require the biennial
audit, which is being done but is not mentioned in the Constitution or
Bylaws. Also, the membership year for those who join in the last
quarter of the biennium needs to be clarified.

President Baker, reporting for Nancy Fogarty, Finance Com-
mittee Chair, stated that both the old and new committees had
met and that the committee would be meeting on May 10 to
consider two grant proposals.

Dave Fergusson, Governmental Relations Committee Chair,
reported that $400 had been contributed to support the ALA
Legislative Day and registration of $12.00 each had been paid for
the 18 members who will be attending. Appreciation was ex-
pressed to Bob Ward for much of the planning. Briefly discussed
was the LSCA program and the federal literacy program, which
is supported by President Bush.

Tina Morris, from the Department of Administration, pre-
sented information about the Literacy Partnership Conference,
"Putting the Pieces Together," which is to be held July 27-28 at
the Four Seasons in Greensboro. Governor Martin will be the
keynote speaker. The Dept. of Cultural Resources, State Library,
will be co-sponsoring the conference.

Pauline Myrick was absent, but a Nominating Committee
report was presented by President Baker. David Fergusson and
Augie Beasley were nominated for the SELA Representative
position. There were no further nominations and the report was
approved unanimously. The ballots will be mailed to the admin-
istrative office and will be due June 1. The Nominating Commit-
tee also distributed a form requesting suggestions for 1991-93
officers, to be received before their May 4 meeting.

Art Weeks, Chair of the Public Relations Committee, has left
the state to become director of the Finger Lakes Library System
in Ithaca, NY. The Committee completed the "Night of a Thou-
sand Stars" project for National Library Week and completed
the video psa featuring Robert Reid of the Charlotte Hornets.

Pat Siegfried, Chair of the Children's Services Section, re-
ported that the programming publication, Reel Readers, is
selling well and has already shown a profit of over $200. The
section agreed to present the ALA Notables Showcase at the
NCASL Conference in September. The CSS Board will be working
with the NCASL Committee studying the possibility of a North
Carolina children's book award. The section will sponsor a
membership reception, an author breakfast, and a booktalking
program at the NCLA Conference.

Martha Ransley, Chair of the College and University Section,
announced a workshop, "Networking: The Challenge of Working
Together," planned for May 11 at Elon College. Jerry Campbell is
to be the featured speaker.

Community and Junior College Section Chair Susan Janney
reported that the section sponsored a program, "CD-ROM for
Reference Services," at the NCCCLRA Conference in High Point
on March 21 which was moderated by NCLA President Baker.
Pat Richardson has been selected as NC Libraries editorial
representative. Mike McCabe will serve as the chairman of the
section's new public relations committee.

Bob Gaines, Documents Section Chair, reported that over
1,000 announcements of the spring workshop, "The United
States Census Bureau and the NC Data Center: Statistical
Products for the 90's," scheduled for May 18 in McKimmon
Center at North Carolina State University, had been mailed. The
section met April 2 to discuss the upcoming Governor's and
White House Conference and to prepare a list of issues, which
was submitted to Diana Young, Conference Coordinator. The
section expressed concern about the proposal of Secretary of
State Rufus Edmisten to distribute a basic package of state
publications and information to all junior and senior high school
libraries in the state. The section is communicating with the
Secretary to suggest that NCASL and the Documents Section be
brought into this discussion and to point out that with more
than 700 libraries involved, this plan would need ample funding
and excellent preparation. The Secretary has applied for a
Reynolds Foundation Grant to fund this distribution. Also
suggested was the possible creation of a video featuring the
publications to be distributed.

Nancy Ray reported that the executive committee of the
Library Administration Section had met February 27 and March
23, and plans are underway for a fall workshop, "Managing in a
Time of Financial Uncertainty."

Melanie Collins, New Members Roundtable Chair, reported
their board met April 9 in Lillington and decided to participate
in the ALA/JMRT Outreach Program in 1990 by making presen-
tations to the new classes in library schools during the fall
semester and to plan a conference program in 1991. They are
seeking an affiliate to ALA since the current affiliate is moving
out of the state.

Laura Benson, NCASL Chair, reported that the biennial
conference will be held September 27-28 in High Point. Three
members will be attending the national Legislative Day. The
section is planning, with the Children's Services Section, to
sponsor a children's book award. The next board meeting will be
May 11 in Greensboro.

Nancy Bates, Chair of the Public Library Section, reported
that the Planning Council met February 9 in Lexington. Two
committees, Automated Services and Literacy, were eliminated
since their concerns were being addressed by NCLA committees.
Adult Services is sponsoring a bookmobile workshop April 30-
May 1 in Greensboro. The Audiovisual Committee is preparing
an AV Directory/Resource Guide and will sponsor another
equipment repair workshop. The Governmental Relations Com-
mittee, with the Public Library Directors Association, will spon-
sor a "Thank You" endeavor for legislators during the upcoming
short session of the General Assembly. Dave Fergusson, Chair of
the NCLA Governmental Relations Committee, will serve on the
coordinating committee. The Personnel Committee is planning
activities with library schools to attract qualified people to the
profession and also plans to address the issues of pay equity and
recertification. The Public Relations Committee plans hands-on
workshops, an ongoing swap and shop, and a conference
speaker, and is presently assisting with the "Night of a Thou-
sand Stars" nationwide effort encouraging family reading. Robert
Reid of the Charlotte Hornets and artist/author Bob Timberlake
will be featured in North Carolina television public service
announcements. Governor Martin has issued a Family Literacy
Proclamation for the promotion. The Young Adult Committee
reported on its successful publication, Grassroots, the home-
work workshop, and the "Best YA Materials" bibliography. The
next meeting of the Planning Council will be May 4 in Lenoir at
the Caldwell County Public Library.

Johannah Sherrer, Chair of Reference and Adult Services,
reported that a workshop is planned for September 28 in
Winston-Salem which will emphasize the art of reference and
the use of technology. An attitudinal survey of library directors'
expectations of reference service will be conducted by mail prior
to the program and will be reported as a prelude to the
program. The section has another task force, reference accuracy
improvement. Based on the Maryland model of training refer-
ence staff, it is a program set up for training trainers. It is an
on-site training program based on the theory that behavioral
aspects determine the success of reference transactions. The
task force is to report on June 1 on the feasibility of offering this
program in North Carolina. The section has a collection develop-
ment proposal to relate collections and reading recommenda-
tions from groups such as the Cancer and Heart Associations
and to put their findings on the electronic bulletin board.

David Gleim, Chair of Resources and Technical Services
Section, reported that their executive committee has met twice.
Minor revisions need to be made to the RTSS Bylaws to conform
to ALA changes. The exact wording will be sent to the Chair of
the Constitution, Codes and Handbook Committee. Other
activities of the section were planning for issue No. 2 of the
section newsletter, NCLA/RTSS Update, planning for the fall
RTSS Conference to be held October 25-26 at the Durham Hilton
Hotel on customized versus standardized technical services, and
deciding to fund the printing and mailing costs of a directory of
NC curriculum materials centers. This is a project of Joanna
Wright, Head of Special Services at UNC-Wilmington and a
member of the RTSS executive committee. The committee will
seek LSCA Title III funding for partial expenses of the
conference.

Reneé Stiff, Roundtable on Ethnic Minority Concerns Chair,
reported their executive board met February 22 at A. & T.
University and projects discussed for the biennium include a





program to provide management and leadership training, espe-
cially for minority librarians, spearheading a project that will
result in a publication on the state of minority librarianship in
NC, sponsoring two workshops during the biennium, and setting
a regular publication schedule for the newsletter. April 27 is the
next meeting date for the board.

President Baker reported that Maury York, Chair of Round-
table on Special Collections, was not able to attend, but she
noted that a grant proposal for a project had been submitted to
the Finance Committee.

Karen Seawell, President of the Roundtable on the Status of
Women in Librarianship, reported that the executive board met
February 6 in Rockingham to formulate plans for a workshop,
"Plateauing: How to Tread Water Without Going Under, A Life
Saving Workshop by the RTSWIL," to be held August 9-10 at
Forsyth County Public Library, and to develop a publication
schedule for MsManagement.

Terri Union, Chair of the Trustees Section, could not attend
but asked President Baker to remind the Board of the Trustee
Conference May 18-19 at the Hilton Hotel in Durham.

Patricia Langelier, ALA Councilor, reported that she would
be attending the annual ALA Conference in Chicago June 23-28
and would report at the next Executive Board meeting.

Jerry Thrasher, SELA Representative, sent a report with
information about a possible chartered bus from Raleigh to
Nashville for the Biennial Conference December 4-8 at the Opryland
Hotel. The cost would be $99 per member. Total cost would be
$2,635 for the bus, $385 for a registration flyer, and $247 for
refreshments and favors. Pickup points would be Raleigh,
Greensboro, Winston-Salem, and Asheville, departing Raleigh
December 4 and returning December 8. Laura Benson moved
that Jerry proceed with the plans. Seconded by Dave Fergusson,
the motion carried unanimously. The theme of the conference
will be "Southern Harmony: Libraries in Tune for the Future."
The SELA report showed that NC has the second largest mem-
bership with 162 members. Florida announced that they would
not be able to host the 1994 conference since ALA would be
meeting in Florida that year. It was suggested at the March 2-3
Leadership Workshop in Atlanta that the SELA states work
together on the most important issues at the White House
Conference, and that they could vote as a bloc on issues of
mutual interest. The SELA states represent 136 votes, 22% of the
total votes at the WHCLIS.
President Baker's report on activities she has attended
included the SELA Leadership Workshop in Atlanta March 2-3.
North Carolina was suggested as a possible site for the 1994
SELA conference. Barry Baker, Chair of the SELA Site Selection
Committee, contacted President Baker about the possibility of a
joint conference. Since this would be the year for the NCASL
Conference, the suggestion was referred to their executive
board. Mr. Baker has contacted Charlotte and is going, with
President Baker, to visit Winston-Salem, also a possible site. Also
attended was the NC Association of High School Library Assist-
ants Conference, of which NCLA is a sponsor. The Membership
Committee had a display at this conference. President Baker has
been asked to be the banquet speaker for the Durham County
Library Association meeting and has been asked to serve on the
steering committee for the Governor's Conference on Libraries
and Information Services.

New Business:

Guests Joanne Abel from Durham County Library and Steve
Sumerford from Greensboro Public Library reported that
there is a movement afoot to create a new roundtable, Social
Responsibilities, which would represent the same interests as
the ALA Social Responsibilities Roundtable. A petition had been
signed, but it did not have the required 100 valid signatures.


After much discussion on the purpose of the roundtable and the
procedure dealing with the formation, it was suggested that no
action be taken until there was a request with the required
number of signatures.

Leonard Sherwin, Treasurer of the Friends of NC Public
Libraries, announced that the Friends would be co-sponsoring
the Trustee Conference May 18-19 in Durham, that dues would
be increased from $5 to $10, and mentioned ways the Friends
help public libraries.

Melanie Collins moved that NCLA spend $2,500 to fund one
of four television public service announcements featuring ACC
football players promoting libraries and reading which would be
shown during football season. Seconded by Nancy Ray. Dave
Fergusson suggested that there be more emphasis on libraries
and librarians. The psa production is coordinated by the State
Library. Jane Moore will relay the suggestion. Motion carried
unanimously.

There was a question about a report from the Scholarship
Committee on the status of the McLendon Loan fund. A report
should be given at the next meeting.

David Gleim suggested that it would be helpful to have an
announcement in NCL or Tarheel Libraries, in the issue pre-
ceding the ALA election, of NCLA members who are running for
an ALA position. It was agreed by consensus that Jane Moore
would suggest this to the editor of Tarheel Libraries.

The July 20 meeting will be at Asheville-Buncombe Technical
College, and the October 19 meeting will be at the Seahawk Hotel
in Morehead City.

Meeting adjourned at 1:20 p.m.


Amanda Bible, Secretary








American Library Association
Annual Conference Report
June 23-28, 1990, Chicago, IL
July 20, 1990

Three Council meetings were held. Action was taken on a
variety of issues. Summary information on most of the resolu-
tions adopted by Council at the Annual Conference is included
in this report. Please let me know if you would like a copy of any
Council Document mentioned. I'll be glad to mail a photocopy to
you.

Implementation of a Midwinter 1990 Council motion: In
response to Tribute to 100 Years of Children's Rooms in Public
Libraries (Tribute #2), commemorative posters and self-stick
note pads were produced to mark the centennial of the estab-
lishment of children's rooms in public libraries. Items can be
ordered through the ALA Graphics Department.

The most notable resolution which passed: CD #90 Reso-
lution on Smoking in Open Meetings of ALA: "Therefore, be it
resolved that Article 7.1.5 of the ALA Policy Manual be replaced
with the following statement: 'Smoking is prohibited in open
meetings and programs sponsored by ALA units during ALA
conferences and midwinter meetings.'"

ALA Executive Director Linda Crismond delivered her
report to Council: ALA now has 50,575 members. Conference
attendance reached an all-time high of 19,868. Linda would like
to visit ALA chapters in every state and welcomes invitations.

ALA Awards of Interest to North Carolina Librarians

ALA General Awards: Baber Research Grant to Evelyn H.
Daniel, School of Information and Library Science, UNC-CH for
"Information Services to Small Businesses from Public Libraries."

Reference and Adult Services Division: Dartmouth Medal
to Encyclopedia of Southern Culture (University of North Caro-
lina Press, 1989). Gale Research Award for Excellence in Busi-
ness Librarianship (BRASS) to Diane C. Strauss.

American Association of School Librarians: Bill Backer
Memorial Scholarship to Wilma H. Bates, Greensboro City
Schools. National School Library Media Program of the Year
Award (Large School District) to the Greensboro City Schools.

Association for Library Collections and Technical Services:
ALCTS Resources Section/Blackwell North America Scholarship
Award to Joe A. Hewitt. The "Best of LRTS" Award/ALCTS to Joe
A. Hewitt.

Library Administration & Management Association: (John
Cotton) Dana Public Relations Award (with the H. W. Wilson
Co.) to Public Library of Charlotte and Mecklenburg County, Inc.

Major Council Documents Adopted at Annual Conference:
Many Council Documents are distributed throughout the year.
Council Documents can be ALA Committee reports, letters,
memoranda, information sheets, background papers, agendas,
directories, rules, procedures, guidelines for preparing resolu-
tions, status reports, plans, policies, press releases, progress
reports, Executive Board reports, memorials, and resolutions.
The following 1989-90 Council Documents (CDs), were approved,
adopted or accepted at Annual Conference 1990. They are listed
in order of consideration.

CD #30 Resolution on Midwinter Meeting Purposes: "The
ALA Midwinter Meeting is convened for the primary purpose of
expediting the business of the Association through sessions of
its governing and administrative delegates serving on boards,
committees and Council. Programs designed for the continuing
education and development of the fields of library service shall
be reserved for Annual Conference except by specific authori-
zation of the Executive Board acting under the provisions of the
ALA Constitution."

CD #47.1 Report to The ALA Council on Editorial Policy
From the Committees on Publishing, Intellectual Freedom, and
Professional Ethics.

CD #83 Policy Monitoring Committee report to Council:
Notable action taken includes incorporation into the ALA Policy
Manual of the following new policies adopted by Council in
January 1990: 50.10 NCLIS Membership and Appointments. The
ALA supports the appointment of members of the NCLIS in an
expeditious manner with appointees who fully meet the require-
ments of the statute.

50.13 Environmental Issues. The ALA urges librarians and
library governing boards to collect and provide information on
the condition of our Earth, its air, ground, water and living
organisms from all available sources.

Notable action taken includes incorporation into the ALA
Policy Manual of the following revised policies adopted by Council
in January 1990:

53.1.3 Access to Resources in the School Library Media
Program. Students and educators served by the school library
media program have access to resources and services free of
constraints resulting from personal, partisan, or doctrinal dis-
approval and which reflect the linguistic pluralism of the
community.

53.1.1 Challenged materials. Challenged materials which
meet the criteria for selection in the materials selection policy of
the library should not be removed under any legal or extra-legal
pressure.

53.1.11 Diversity in Collection Development. A balanced
collection reflects diversity of materials, not equality of numbers.
Collection development responsibilities include selecting mate-
rials in the languages in common use in the community which
the library serves...

53.1.2 Expurgation of Library Materials. Expurgation of any
parts of books or other library resources by the library, its agent,
or its parent institution is a violation of the Library Bill of Rights
because it denies access to the complete work, and therefore, to
the entire spectrum of ideas that the work was intended to
express.

CD #61.5 Freedom to View. The ALA endorses the Freedom
to View, a statement of the (Educational Film Library Associa-
tion) American Film and Video Association.

CD #86 Resolution on the Reauthorization and Reapprop-
riation of the National Endowment for the Arts and the National
Endowment for the Humanities: "Resolved, that the ALA strong-
ly urges the members of the U.S. Congress to resist any limita-
tions or reductions of the appropriations for the NEA, the NEH,
or the Institute of Museum Services on the basis of doctrinal
disapproval of projects funded by the Endowments. . . ."

CD #79 Report to Council of ALA President Patricia W.
Berger.

CD #91 Chapter Status for the Guam Library Association:
"Resolved that in accordance with Constitution Article X, Section
8, and Bylaws Article V, the ALA Council approve the application
for Chapter status in the American Library Association from the
Guam Library Association."

CD #98 ALA Committee on Minority Concerns: report to
Council.

CD #88 Planning Committee: report to Council.

CD #82 Intellectual Freedom Committee: report to Council.

CD #93 Resolution on "Fair Use" of Unpublished Sources:
"Resolved, that the ALA express its support and urge Congress
to enact legislation which would eliminate the distinction be-
tween published and unpublished materials with regard to the
fair use of quotations."

CD #94 Resolution concerning drastic reductions in the
budgets of military libraries: "Resolved, that the ALA again





urge the President of the United States, the U.S. Congress, and
the heads of military departments and agencies to seek other
means to control expenditures rather than to close, reduce, or
contract out libraries and information centers."

CD #95 Resolution concerning the need for expanding
public access to the U.S. Department of Education Research
Library: "Resolved that the ALA recommend that public access
and use of the Department of Education Research Library be
expanded by broadening the scope of the Library's mission and
by providing additional resources."

CD #80 Disaster Relief Committee report.

CD #77 National Library Week resolution: "Resolved the
ALA sponsor 'The Great American Read Aloud' during National
Library Week and School Library Media Month each year.
Resolved that the ALA Public Information Office and National
Library Week plan and implement this national event. Resolved
that all types of libraries -- school, public, academic, military
and special -- be urged to participate in the annual 'Great
American Read Aloud.'"

CD #87 Resolution on the Use of American Library Associa-
tion Name and Logo(s): "Resolved that the Executive Director
prepare a plan for the use of the ALA logo and related devices."

CD #89 Resolution on Closing of Schools of Library and
Information Science: "Resolved, that this Association endorse
the Columbia School of Library Service's call for establishment
of a broad-based Special Commission by the ALA which would
examine the issues addressed in the report of Columbia's
Provost that led to eliminating this pioneering school, review the
record of previous closings of library and information science
programs to determine if a general pattern is discernible, and
assess the general impact of the closings of the several schools;
and be it further Resolved, that this commission report its
findings to the ALA Council by June 1991."

Membership Doc. #1 Poor People's Services Policy Resolu-
tion: "Resolved, that the ALA adopt the following policy on
Library Service to Poor People, modeled on the ALA Minority
Concerns Policy . . . The American Library Association shall
implement these objectives by: (15-item list -- please let me
know if you want a copy)." Council referred this resolution to
ALA's Access to Information Coordinating Committee.

CD #92 Committee on Organization report.

CD #97 Resolution on The Starvation of Young Black Minds:
The Effects of Book Boycotts in South Africa: "Resolved that the
ALA reaffirm its current policies and not endorse the AAP
report" (which recommends lifting of the boycott against South
Africa).

CD #82.7 Resolution in Opposition to the Anti-Obscenity
Pledge Requirement of the National Endowments for the Arts
and Humanities.

CD #82.8 Resolution in Support of Dennis Barrie and the
Contemporary Art Center of Cincinnati: "Resolved, that the
Council of the ALA on behalf of its more than 50,000 members
honor and support Dennis Barrie and the board of trustees of
the museum for their leadership and courage in resisting censor-
ship and their commitment to the free expression of ideas in the
face of extreme personal risk . . ."


CD #82.9 Resolution on Flag Burning: "Resolved that the
ALA expresses its support and appreciation for the vote in the
U.S. House of Representatives and the U.S. Senate to uphold free
expression as provided in the Bill of Rights, by defeating the
proposed constitutional amendment on flag burning."

CD #104 Resolution on Higher Education Act Reauthoriza-
tion: "Resolved that the ALA supports reauthorization of the
Higher Education Act with the following components: Title I-A,
Academic Library Technology and Cooperation Grants, Title
II-B, Library Education, Training, Research and Development,
Title II-C, Strengthening Research Library Resources, Title II-D,
College Library Technology and Cooperation Grants, Title IV-C,
Student Assistance, Work-Study Programs, Title VI, Interna-
tional Education Programs Part A, Section 607, Periodicals
Published Outside the U.S., Title VII, Construction, Reconstruc-
tion, and Renovation of Academic Facilities."

CD #106 Resolution on Government Publications Discon-
tinued or Endangered: "Resolved, that ALA and its units use all
normal channels of communication to alert ALA members and
the general public of the dangers associated with the loss of
these and other information resources; and be it further re-
solved, that ALA and its units use all normal channels of com-
munication to alert U.S. government executive agencies and the
appropriate Congressional committees of the value of these
information resources to the American people."

CD #107 Resolution on the Defense Management Improve-
ment Act of 1990: "Resolved, that the ALA urge Congress to
delete section 216 of the Defense Management Improvement Act
of 1990 that permits the Defense Department to bypass the
printing procedure requirements of 44 USC 501, 502."

CD #109 Resolution of Support for Library of Congress
Appropriations for FY 1991: "Resolved, that the ALA take
immediate action to support and to encourage public support
for the Library of Congress fiscal year 1991 appropriation
request."

CD #100 Resolution on Postponement of Changes to the
Annual Conference Skeleton Schedule: "Resolved, that the Coun-
cil direct the ALA Executive Board to postpone implementation
of conference scheduling changes until the 1992 Annual Confer-
ence in San Francisco."

CD #101 Resolution for Support of the International Youth
Library, Munich, Germany: "Resolved, that the ALA go on
record supporting the international focus of the International
Youth Library."

CD #103 Resolution on Preservation Cooperation: "Re-
solved, that the ALA work actively through appropriate channels
to expand and strengthen international programs of coopera-
tion to preserve the cultural record worldwide, to promote the
development of and adherence to technical standards for pre-
servation techniques, and to encourage the national and inter-
national collection of and dissemination of bibliographic and
holdings information about preservation masters."

Patricia A. Langelier, NCLA Councilor









About the Authors...

Sharon L. Baker

Education: B.S., The Ohio State University; M.L.S.,
Kent State University; Ph.D., University of
Illinois.

Position: Associate Professor, School of Library
and Information Science, University of Iowa.

Karen S. Croneis

Education: B.S., The Ohio State University;
M.S.L.S., Case Western Reserve University.

Position: Head, Physics-Mathematics-Astronomy
Library, The University of Texas at Austin.

James J. Govern

Education: B.A., West Liberty State College;
M.S.L.S., University of Kentucky.

Position: Library Director, Stanly County Public
Library.

Patsy J. Hansel

Education: B.A., University of North Carolina at
Charlotte; M.A., Wake Forest; M.S.L.S., Uni-
versity of North Carolina at Chapel Hill.

Position: Director, Williamsburg, Virginia,
Regional Library.

Patricia M. Kelley

Education: B.A., University of Colorado; M.L.S.,
University of Maryland; M.A., George Wash-
ington University.

Position: Assistant University Librarian for Pro-
grams and Services, George Washington
University.

Cynthia R. Levine

Education: A.B., University of North Carolina at
Chapel Hill; M.Ln., Emory University; M.S.,
North Carolina State University.

Position: Reference Librarian, North Carolina
State University Libraries.

Valerie W. Lovett

Education: B.A., Emory University; M.Ln., Emory
University.

Position: Assistant Library Director, Administra-
tive Services, Wake County Public Libraries.

Patrick J. Mullin

Education: B.A., University of Notre Dame; M.A.,
Purdue University; M.S.L.S., University of
Kentucky.

Position: Systems Librarian, University of North
Carolina at Chapel Hill; Interim Director,
Triangle Research Libraries Network.

Catherine Smith

Education: B.A., Carlow College; M.L.S., University
of Pittsburgh; M.A., Cleveland State Univer-
sity; Ph.D., University of Pittsburgh.

Position: Assistant Professor, Library and Infor-
mation Studies, University of North Carolina
at Greensboro.

Sally Ann Strickler

Education: B.S., Mississippi University for Women;
M.S.L.S., Western Kentucky University; Ed.S.,
Western Kentucky University; Ed.D., Vander-
bilt University.

Position: Department Head, Department of
Library Public Services, Western Kentucky
University.

Rebecca Sue Taylor

Education: B.A., Bowling Green State University;
M.S.L.S., University of Kentucky.

Position: Coordinator of Youth Services, New
Hanover County Public Library.

Jerry A. Thrasher

Education: B.A., University of Alabama; M.S.L.S.,
Florida State University.

Position: Library Director, Cumberland County
Public Library & Information Center.

Harry Tuchmayer

Education: B.A., M.L.S., M.A., University of Cali-
fornia, Los Angeles.

Position: Headquarters Librarian, New Hanover
County Public Library.

John E. Ulmschneider

Education: B.A., University of Virginia; M.S.L.S.,
University of North Carolina at Chapel Hill.

Position: Assistant Director for Library Systems,
North Carolina State University Libraries.

Linda H. Y. Wang

Education: M.L.I.S., University of Texas at Austin.

Position: Reference Librarian, University of South
Alabama Libraries.





NORTH CAROLINA LIBRARY ASSOCIATION

President

BARBARA BAKER

Durham Technical

Community College

1637 Lawson Street

Durham, NC 27703

Telephone: 919/598-9218
Fax: 919/595-9412

Vice-President/President Elect

JANET L. FREEMAN

Carlyle Campbell Library

Meredith College

3800 Hillsborough Street

Raleigh, NC 27607-5298

Telephone: 919/829-8531
Fax: 919/829-2830

Secretary
AMANDA BIBLE
Columbus County Library
407 N. Powell Blvd.
Whiteville, NC 28472
Telephone: 919/642-3116
Fax: 919/642-3839

Children's Services Section
PATRICIA SIEGFRIED
Public Library of Charlotte &
Mecklenburg County
310 North Tryon Street
Charlotte, NC 28202
Telephone: 704/336-6204

Fax: 704/336-2000

College and University Section
MARTHA RANSLEY
Jackson Library
UNC-G
Greensboro, NC 27412-5201
Telephone: 919/334-5245

Community and Junior College
SUSAN JANNEY
Charlotte AHEC Library
PO Box 32861
Charlotte, NC 28232
Telephone: 704/355-3129

Documents Section
ROBERT GAINES
Jackson Library
University of NC at Greensboro
Greensboro, NC 27412-5201
Telephone: 919/334-5251

Library Administration and
Management Section
NANCY RAY
Southern Pines Public Library
180 SW Broad Street
Southern Pines, NC 28387
Telephone: 919/692-8235

1989-1991 EXECUTIVE BOARD

Treasurer

MICHAEL J. LACROIX

Ethel K. Smith Library

Wingate College

P. O. Box 217

Wingate, NC 28174-0217

Telephone: 704/233-8090
Fax: 704/233-8254

Directors
SYLVIA SPRINKLE-HAMLIN
Forsyth County Public Library
660 West Fifth Street
Winston-Salem, NC 27101
Telephone: 919/727-2556
Fax: 919/727-2549

H. DAVID HARRINGTON
Sales Representative
Britannica

512 Brook Street
Salisbury, NC 28144
Telephone: 704/633-0597

ALA Councilor (4 Year Term)
PATRICIA A. LANGELIER
Institute of Government
CB 3330 - Knapp Building, UNC-CH
Chapel Hill, NC 27599
Telephone: 919/966-4130

Fax: 919/962-0654

SECTION/ROUND TABLE CHAIRS

New Members Round Table
MELANIE COLLINS
Harnett County Public Library
PO Box 1149
Lillington, NC 27546
Telephone: 919/893-3446
Fax: 919/893-3001

North Carolina Association of
School Librarians
LAURA BENSON
High Point Public Schools
900 English Road
High Point, NC 27260
Telephone: 919/885-5161

North Carolina Library
Paraprofessional Association
ANN H. THIGPEN
Sampson-Clinton Public Library
217 Graham Street
Clinton, NC 28328
Telephone: 919/592-4153

Public Library Section
NANCY BATES
Davidson County Public Library
602 S. Main Street
Lexington, NC 27292
Telephone: 704/249-7011 ext. 295

Reference and Adult Services Section
JOHANNAH SHERRER
William R. Perkins Library
Duke University
Durham, NC 27706
Telephone: 919/648-2372

SELA Representative
JERRY THRASHER

Cumberland County Public Library

300 Maiden Lane

Fayetteville, NC 28301

Telephone: 919/483-1580
Fax: 919/483-8644

Editor, North Carolina Libraries
FRANCES BRADBURN
Joyner Library
East Carolina University
Greenville, NC 27858-4353
Telephone: 919/757-6076
Fax: 919/757-6618

Past-President
PATSY J. HANSEL
Williamsburg Regional Library
515 Scotland Street
Williamsburg, VA 23185

Administrative Assistant (Ex Officio)

MARTHA FONVILLE

North Carolina Library Association
c/o State Library of North Carolina

109 East Jones Street

Raleigh, NC 27601-1023

Telephone: 919/839-6252
Fax: 919/839-6252

Resources and Technical Services
Section
DAVID GLEIM
Catalog Department, CB 3914
Davis Library, UNC-CH
Chapel Hill, NC 27599-3914
Telephone: 919/962-0153
Fax: 919/962-0484

Round Table on Ethnic/Minority

Concerns
RENEE STIFF

James E. Shepard Memorial Library

North Carolina Central University
1801 Fayetteville Street

Durham, NC 27707

Telephone: 919/560-6097

Round Table on Special Collections
MAURICE C. YORK
Joyner Library
East Carolina University
Greenville, NC 27858-4354
Telephone: 919/757-6617
Fax: 919/757-6618

Round Table on The Status of
Women in Librarianship
KAREN SEAWELL
Greensboro AHEC
1200 N. Elm Street
Greensboro, NC 27420
Telephone: 919/379-4483
Fax: 919/379-4328

Trustees Section
TERRI UNION
508 Cliffside Drive
Fayetteville, NC 28203
Telephone: 919/483-2195
Fax: 919/483-1091






Editor
FRANCES BRYANT BRADBURN
Joyner Library
East Carolina University
Greenville, NC 27858
(919) 757-6076

Associate Editor
HOWARD F. McGINN
Division of State Library
109 East Jones Street
Raleigh, NC 27611
(919) 733-2570

Associate Editor
ROSE SIMON
Dale H. Gramley Library
Salem College
Winston-Salem, NC 27108
(919) 721-2649

Book Review Editor
ROBERT ANTHONY
CB#3930, Wilson Library
University of North Carolina
Chapel Hill, NC 27599
(919) 962-1172

Advertising Manager
HARRY TUCHMAYER

New Hanover County Public Library

201 Chestnut Street
Wilmington, NC 28401
(919) 341-4390

Editor, Tar Heel Libraries
JOHN WELCH
Division of State Library
109 East Jones Street
Raleigh, NC 27611
(919) 733-2570

EDITORIAL STAFF

Children's Services

SATIA ORANGE

Forsyth County Public Library
660 West Fifth Street
Winston-Salem, NC 27101
(919) 727-2556

College and University

JINNIE Y. DAVIS

Planning and Development

D. H. Hill Library

North Carolina State University
Box 7111

Raleigh, NC 27695

(919) 737-3659

Community and Junior College

PAT RICHARDSON

Wake Technical Community College
9101 Fayetteville Road

Raleigh, NC 27603

(919) 772-0551

Documents
LISA K. DALTON

Rockingham County Public Library
598 Pierce Street

Eden, NC 27288

(919) 623-3168

Junior Members Round Table

DOROTHY DAVIS HODDER

Public Services Librarian

New Hanover County Public Library
201 Chestnut Street

Wilmington, NC 28401

(919) 341-4390

North Carolina Library
Paraprofessional Association
JUDIE STODDARD
Onslow County Public Library
68 Doris Avenue East
Jacksonville, NC 28540
(919) 455-7350

Public Library
BOB RUSSELL
Elbert Ivey Memorial Library
420 Third Street NW
Hickory, NC 28601
(704) 322-2905

Reference/Adult Services
ILENE NELSON
William R. Perkins Library
Duke University
Durham, NC 27706
(919) 684-2373

Resources and Technical Services
GENE LEONARDI
Shepard Library
North Carolina Central University
Durham, NC 27707
(919) 560-6220

Round Table for Ethnic/Minority
Concerns
EUTHENA NEWMAN
North Carolina A & T University
F. D. Bluford Library
1601 E. Market Street
Greensboro, NC 27411
(919) 379-7782

Trustees
MRS. ERNEST M. KNOTTS
Route 2, Box 505
Albemarle, NC 28001
(704) 982-7434

N. C. Association of School Librarians
KATHERINE R. CAGLE
R. J. Reynolds High School
Winston-Salem, NC 27106
(919) 727-2260

Round Table on the Status of Women in Librarianship
ELIZABETH LANEY
CB#3360, 100 Manning Hall
University of North Carolina
Chapel Hill, NC 27599-3360
(919) 962-8361

Address all correspondence to
Frances Bryant Bradburn, Editor
Joyner Library, East Carolina University
Greenville, NC 27858

North Carolina Libraries, published four times a year, is the official publication of the North Carolina
Library Association. Membership dues include a subscription to North Carolina Libraries. Membership
information may be obtained from the treasurer of NCLA.

Subscription rates are $32.00 per year, or $10.00 per issue, for domestic subscriptions; $50.00 per year,
or $15.00 per issue, for foreign subscriptions. Backfiles are maintained by the editor. Microfilm copies are
available through University Microfilms. North Carolina Libraries is indexed by Library Literature and
publishes its own annual index.

Editorial correspondence should be addressed to the editor; advertisement correspondence should
be addressed to the advertising manager. Articles are juried.

North Carolina Libraries is printed by Meridional Publications, Wake Forest, NC.

Issue deadlines are February 10, May 10, August 10, and November 10.



Title
North Carolina Libraries, Vol. 48, no. 3
Description
North Carolina Libraries publishes articles of interest to librarians in North Carolina and around the world. It is the official publication of the North Carolina Library Association and as such publishes the Official Minutes of the Executive Board and conference proceedings.
Date
1990
Original Format
magazines
Extent
16cm x 25cm
Local Identifier
Z671.N6 v. 48
Location of Original
Joyner NC Stacks
Rights
This item has been made available for use in research, teaching, and private study. Researchers are responsible for using these materials in accordance with Title 17 of the United States Code and any other applicable statutes. If you are the creator or copyright holder of this item and would like it removed, please contact us at als_digitalcollections@ecu.edu.
http://rightsstatements.org/vocab/InC-EDU/1.0/
Permalink
https://digital.lib.ecu.edu/27330