Pergamon

Information Processing & Management, Vol. 31, No. 4, pp. 621-624, 1995
Elsevier Science Ltd. Printed in Great Britain

BOOK REVIEWS

Libraries and Life in a Changing World: The Metcalfe Years, 1920-1970. Papers from Australian Library History Forum V at the University of New South Wales, 6-7 November 1992. W. B. RAYWARD. School of Information Library and Archive Studies, The University of New South Wales, Sydney (1993). ix+231 pp., $26.00. ISBN 0-7334-0014-0.

In the face of the declining respect paid to library history in many parts of the world, it is highly refreshing to see its growing vitality in Australia. In this, the fifth collection of papers to appear since 1985 from the Library History Forum, the verve and excitement of the participants continue unabated. Indeed, what distinguishes this and the previous volumes in the series is the sense of enthusiasm they communicate of people involved in pioneering, research-front activity. For the outsider, knowing little about Australian libraries and their history, or even about the general thrust of Australian history and culture, reading this volume and its predecessors has been a thought-provoking experience prompting a range of reactions and questions. Why is Australian library history flourishing at this time? What has been the role of Boyd Rayward in this development? Was there scholarly library history before the appearance of the Library History Forum? Is there not a sufficient body of literature to justify a historiographic analysis of Australian library history and a monographic synthesis of the literature to date? To Australians, such questions may appear irrelevant or simplistic, but to someone at a great distance they pose themselves as inevitable reactions to this remarkable publishing phenomenon. This brings us to the consideration of volume V, the Metcalfe years.
To an overseas reader, it is extraordinary that one individual in the middle decades of this century can be credited with exerting such a decisive influence upon the library development of one country. Yet John W. Metcalfe (1901-1982), during the course of his professional career (1920-1970), played a crucial role in virtually all aspects of Australian librarianship: becoming head of the Public Library of New South Wales (Sydney), then of the Library of the University of New South Wales and first Director of its School of Librarianship; playing a crucial role in library commissions of enquiry; developing professional associations; and intellectually dominating Australian librarianship through his teaching and writing on indexing and subject cataloguing. These facets of his contribution are brought out in a series of articles of varying but generally good quality. In addition, there are several articles dealing with libraries, publishing, and collecting during the 1920-70 era that bear little direct relevance to Metcalfe but contribute to an understanding of his context.

If there is one general criticism to be made of this volume, it is the failure to explain, rather than merely describe, how Metcalfe came to achieve such a prominent role in Australian librarianship. Was it a necessary consequence of his being centred in Sydney and New South Wales? Was it because he was the only strong-minded person with a willingness to state firm opinions? Or was it simply a factor of his being a man faced with women competitors who, according to rumours reaching even Canada but not discussed in this volume, were said to be lesbians? This volume, like its predecessors, is well worth reading, particularly for anyone interested in comparative librarianship. A very different model of library development differentiates Australia from Canada, Britain, or the United States.

Graduate School of Library and Information Studies
McGill University
Montreal, Canada

PETER F. McNALLY

Managing Gigabytes.
IAN H. WITTEN, ALISTAIR MOFFAT and TIMOTHY C. BELL. Van Nostrand Reinhold, New York (1994). 429 pp., $54.95. ISBN 0-442-01863-0.

Managing Gigabytes covers the full range of problems that arise when designing an information retrieval system (IRS) in a context in which computers can store massive (gigabyte or terabyte) amounts of information. Some of the topics included will already be familiar to most readers of IPM, for example the exponential growth of information (though the advent of the internet does provide a new slant on this phenomenon) and the indexing and retrieval of information. Here the standard methods are reviewed, with a strong emphasis on the vector models. This will provide a useful review for many readers; but most significantly, the presence of these sections on information retrieval, in the context of the remainder of the book, emphasizes the underlying coherence of the various components making up a realistic modern IRS. The questions raised here relate closely to topics dealt with in the remainder of the book, but not usually associated with the retrieval of information.

The heart of the book deals with data compression, an area in which the authors have achieved a dominant position, and about which they are able to speak with authority and insight. It may at first appear ironic that the question of data compression should have become so important at a time when technologies are evolving that permit us to store new orders of magnitude of data. But these technologies are the primary force driving research in this field, by permitting new classes of data to be stored. Databases of full text (rather than just index records to text) are the most obvious examples, but image and audio data especially consume immense amounts of storage. The internet now makes available images that include examples taken from art museum catalogues, in full color; weather maps; and images of text.
We can download music and voice data, and there is increasing interest in distributing 'moving' pictures. Such opportunities challenge even the capabilities of state-of-the-art optical storage media. But also of concern is the transmission of such data over communication channels that are already becoming overwhelmed. This book has as its scope all the issues posed by this revolution in information control. There are chapters, for example, describing the data compression standards that govern facsimile transmission and the dominant JPEG and MPEG standards; very interesting are sections on text images, particularly on partitioning the image of a page of a document into its component segments, allowing appropriate storage techniques to be used for each.

A book bringing together in a single volume, at a more or less introductory level, such a diversity of material pertaining to our being able to store whole multi-media libraries would as such be very useful, both as a text and for professional updating. But some of the most interesting and valuable portions deal with the research interests of the authors. Two of the authors have made very important contributions to, and are strongly associated with, a form of compression known as arithmetic coding. It is not surprising that this is carefully and lucidly described. But, being semi-reformed arithmeticians, they have also allowed sections on Lempel-Ziv coding and on the older but, as is becoming increasingly recognised, still competitive Huffman coding. The bias towards arithmetic coding remains, but the reader can obtain a good understanding of the range of methods in vogue today by reading this book.

Also very interesting is the observation that massive amounts of data, say text, pose problems not only in storage but also in access. Traditionally, the inverted index has been the primary data structure used to access text.
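As a rough illustration of the older method the review mentions (this sketch is the reviewer's aside made concrete, not code from the book), a minimal Huffman coder in Python repeatedly merges the two least frequent subtrees, so frequent symbols end up near the root with short codes:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Assign shorter bit strings to more frequent symbols."""
    freq = Counter(text)
    # Heap entries are (frequency, tiebreak, tree); a tree is either
    # a single symbol or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent trees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):  # internal node: recurse into branches
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                        # leaf: record the finished code
            codes[tree] = prefix or "0"
    walk(heap[0][2], "")
    return codes

codes = huffman_code("abracadabra")
encoded = "".join(codes[c] for c in "abracadabra")
```

For "abracadabra" the symbol 'a' (frequency 5) receives a one-bit code, and the 11 characters compress to 23 bits; the result is prefix-free, so the bit stream can be decoded unambiguously.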
But creating and using an inverted index, itself a structure containing an amount of data commensurate with the size of the text itself, become very difficult when gigabytes and terabytes of data are to be controlled. This book discusses in some detail methods for implementing the inverted index, using computers of not-unreasonable power, within an acceptable amount of time. The methods used rely heavily on data compression, which very nicely unifies the ...

Indeed, numerous library case studies and several collections of such studies have been published since the late 1980s; but this case-study literature is generally idiosyncratic and not well coordinated as a body. Here, an overall structure for the work was devised and the individual chapters commissioned from experts. The result is a treatment of the subject that is relatively complete and coherent, while providing the specific fruits of practitioners' experience in particular cases. The editors have also made a conscious effort to cover the basic issues and principles associated with this practice. The contributors are all experienced library and systems professionals, with backgrounds in all types of libraries, as well as a couple of academics. There is a geographic focus on the UK but, as the editors point ...
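The compressed inverted index discussed in the Managing Gigabytes review can be made concrete with a small sketch. This is a hypothetical illustration, not code from the book: a toy index mapping terms to document-id lists, plus the gap encoding that makes those lists compressible, since the sorted ids can be stored as small differences rather than full numbers:

```python
def build_inverted_index(docs):
    """Map each term to the sorted list of ids of documents containing it."""
    index = {}
    for doc_id, text in enumerate(docs):
        for term in set(text.lower().split()):  # set(): one posting per doc
            index.setdefault(term, []).append(doc_id)
    return index

def gap_encode(postings):
    """Replace each doc id after the first with its gap from the previous id.
    Small gaps can then be stored in far fewer bits than full ids."""
    return postings[:1] + [b - a for a, b in zip(postings, postings[1:])]

docs = ["the cat sat", "the dog sat", "a cat ran"]
index = build_inverted_index(docs)
```

Here `index["cat"]` is `[0, 2]`, and `gap_encode([3, 7, 11])` yields `[3, 4, 4]`; for a common term in a large collection the gaps cluster around small values, which is exactly what variable-length codes exploit.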