Library of Congress Workshop on Etexts Part 6


SESSION III. DISTRIBUTION, NETWORKS, AND NETWORKING: OPTIONS FOR DISSEMINATION

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ ZICH * Issues pertaining to CD-ROMs * Options for publishing in CD-ROM *

Robert ZICH, special assistant to the associate librarian for special projects, Library of Congress, and moderator of this session, first noted the blessed but somewhat awkward circumstance of having four very distinguished people representing networks and networking or at least leaning in that direction, while lacking anyone to speak from the strongest possible background in CD-ROMs. ZICH expressed the hope that members of the audience would join the discussion. He stressed the subtitle of this particular session, "Options for Dissemination," and, concerning CD-ROMs, the importance of determining when it would be wise to consider dissemination in CD-ROM versus networks. A shopping list of issues pertaining to CD-ROMs included: the grounds for selecting commercial publishers, and in-house publication where possible versus nonprofit or government publication. A similar list for networks included: determining when one should consider dissemination through a network, identifying the mechanisms or entities that exist to place items on networks, identifying the pool of existing networks, determining how a producer would choose between networks, and identifying the elements of a business arrangement in a network.

Options for publishing in CD-ROM: an outside publisher versus self-publication. If an outside publisher is used, it can be nonprofit, such as the Government Printing Office (GPO) or the National Technical Information Service (NTIS), in the case of government. The pros and cons associated with employing an outside publisher are obvious. Among the pros, there is no trouble getting accepted. One pays the bill and, in effect, goes one's way. Among the cons, when one pays an outside publisher to perform the work, that publisher will perform the work it is obliged to do, but perhaps without the production expertise and skill in marketing and dissemination that some would seek. There is the body of commercial publishers that do possess that kind of expertise in distribution and marketing but that obviously are selective. In self-publication, one exercises full control, but then one must handle matters such as distribution and marketing. Such are some of the options for publishing in the case of CD-ROM.

In the case of technical and design issues, which are also important, there are many matters about which many at the Workshop already knew a good deal: retrieval system requirements and costs, what to do about images, the various capabilities and platforms, the trade-offs between cost and performance, concerns about local-area networkability, interoperability, etc.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ LYNCH * Creating networked information is different from using networks as an access or dissemination vehicle * Networked multimedia on a large scale does not yet work * Typical CD-ROM publication model a two-edged sword * Publishing information on a CD-ROM in the present world of immature standards * Contrast between CD-ROM and network pricing * Examples demonstrated earlier in the day as a set of insular information gems * Paramount need to link databases * Layering to become increasingly necessary * Project NEEDS and the issues of information reuse and active versus passive use * X-Windows as a way of differentiating between network access and networked information * Barriers to the distribution of networked multimedia information * Need for good, real-time delivery protocols * The question of presentation integrity in client-server computing in the academic world * Recommendations for producing multimedia +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Clifford LYNCH, director, Library Automation, University of California, opened his talk with the general observation that networked information constituted a difficult and elusive topic because it is something just starting to develop and not yet fully understood. LYNCH contended that creating genuinely networked information was different from using networks as an access or dissemination vehicle and was more sophisticated and more subtle. He invited the members of the audience to extrapolate, from what they heard about the preceding demonstration projects, to what sort of a world of electronic information--scholarly, archival, cultural, etc.--they wished to end up with ten or fifteen years from now.

LYNCH suggested that to extrapolate directly from these projects would produce unpleasant results.

Putting the issue of CD-ROM in perspective before getting into generalities on networked information, LYNCH observed that those engaged in multimedia today who wish to ship a product, so to say, probably do not have much choice except to use CD-ROM: networked multimedia on a large scale basically does not yet work because the technology does not exist. For example, anybody who has tried moving images around over the Internet knows that this is an exciting touch-and-go process, a fascinating and fertile area for experimentation, research, and development, but not something that one can become deeply enthusiastic about committing to production systems at this time.

This situation will change, LYNCH said. He differentiated CD-ROM from the practices that have been followed up to now in distributing data on CD-ROM. For LYNCH the problem with CD-ROM is not its portability or its slowness but the two-edged sword of having the retrieval application and the user interface inextricably bound up with the data, which is the typical CD-ROM publication model. It is not a case of publishing data but of distributing a typically stand-alone, typically closed system, all--software, user interface, and data--on a little disk. Hence, all the between-disk navigational issues as well as the impossibility in most cases of integrating data on one disk with that on another. Most CD-ROM retrieval software does not network very gracefully at present. However, in the present world of immature standards and lack of understanding of what network information is or what the ground rules are for creating or using it, publishing information on a CD-ROM does add value in a very real sense.

LYNCH drew a contrast between CD-ROM and network pricing and in doing so highlighted something bizarre in information pricing. A large institution such as the University of California has vendors who will offer to sell information on CD-ROM for a price per year in four digits, but who, for the same data (e.g., an abstracting and indexing database) on magnetic tape, regardless of how many people may use it concurrently, will quote a price in six digits.

What is packaged with the CD-ROM in one sense adds value--a complete access system, not just raw, unrefined information--although it is not generally perceived that way. This is because the access software, although it adds value, is viewed by some people, particularly in the university environment where there is a very heavy commitment to networking, as being developed in the wrong direction.

Given that context, LYNCH described the examples demonstrated as a set of insular information gems--Perseus, for example, offers nicely linked information, but would be very difficult to integrate with other databases, that is, to link together seamlessly with other source files from other sources. It resembles an island, and in this respect is similar to numerous stand-alone projects that are based on videodiscs, that is, on the single-workstation concept.

As scholarship evolves in a network environment, the paramount need will be to link databases. We must link personal databases to public databases, to group databases, in fairly seamless ways--which is extremely difficult in the environments under discussion with copies of databases proliferating all over the place.

The notion of layering also struck LYNCH as lurking in several of the projects demonstrated. Several databases in a sense constitute information archives without a significant amount of navigation built in.

Educators, critics, and others will want a layered structure--one that defines or links paths through the layers to allow users to reach specific points. In LYNCH's view, layering will become increasingly necessary, and not just within a single resource but across resources (e.g., tracing mythology and cultural themes across several classics databases as well as a database of Renaissance culture). This ability to organize resources, to build things out of multiple other things on the network or select pieces of it, represented for LYNCH one of the key aspects of network information.

Contending that information reuse constituted another significant issue, LYNCH commended to the audience's attention Project NEEDS (i.e., National Engineering Education Delivery System). This project's objective is to produce a database of engineering courseware as well as the components that can be used to develop new courseware. In a number of the existing applications, LYNCH said, the issue of reuse (how much one can take apart and reuse in other applications) was not being well considered. He also raised the issue of active versus passive use, one aspect of which is how much information will be manipulated locally by users. Most people, he argued, may do a little browsing and then will wish to print. LYNCH was uncertain how these resources would be used by the vast majority of users in the network environment.

LYNCH next said a few words about X-Windows as a way of differentiating between network access and networked information. A number of the applications demonstrated at the Workshop could be rewritten to use X across the network, so that one could run them from any X-capable device--a workstation, an X terminal--and transact with a database across the network. Although this opens up access a little, assuming one has enough network to handle it, it does not provide an interface to develop a program that conveniently integrates information from multiple databases.

X is a viewing technology that has limits. In a real sense, it is just a graphical version of remote log-in across the network. X-type applications represent only one step in the progression towards real access.

LYNCH next discussed barriers to the distribution of networked multimedia information. The heart of the problem is a lack of standards to provide the ability for computers to talk to each other, retrieve information, and shuffle it around fairly casually. At the moment, little progress is being made on standards for networked information; for example, present standards do not cover images, digital voice, and digital video. A useful tool kit of exchange formats for basic texts is only now being assembled. The synchronization of content streams (i.e., synchronizing a voice track to a video track, establishing temporal relations between different components in a multimedia object) constitutes another issue for networked multimedia that is just beginning to receive attention.

Underlying network protocols also need some work; good, real-time delivery protocols on the Internet do not yet exist. In LYNCH's view, highly important in this context is the notion of networked digital object IDs, the ability of one object on the network to point to another object (or component thereof) on the network. Serious bandwidth issues also exist. LYNCH was uncertain if billion-bit-per-second networks would prove sufficient if numerous people ran video in parallel.
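LYNCH's doubt about parallel video is easy to motivate with arithmetic. The back-of-the-envelope sketch below assumes illustrative per-stream video bitrates of my own choosing, not figures from the talk:

    # Rough check of how many parallel video streams a "billion-bit-
    # per-second" network could carry. Bitrates are assumed values.
    NETWORK_BPS = 1_000_000_000

    # Assumed per-stream rates, in bits per second:
    uncompressed = 640 * 480 * 24 * 30  # ~221 Mbps: 640x480, 24-bit, 30 fps
    compressed = 1_500_000              # ~1.5 Mbps, MPEG-1-class video

    for label, rate in [("uncompressed", uncompressed),
                        ("compressed", compressed)]:
        print(f"{label}: ~{NETWORK_BPS // rate} parallel streams")

    # Prints roughly 4 uncompressed streams versus 666 compressed ones,
    # so even a gigabit network saturates quickly without compression.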

LYNCH concluded by offering an issue for database creators to consider, as well as several comments about what might constitute good trial multimedia experiments. In a networked information world the database builder or service builder (publisher) does not exercise the same extensive control over the integrity of the presentation; strange programs "munge" with one's data before the user sees it. Serious thought must be given to what guarantees integrity of presentation. Part of that is related to where one draws the boundaries around a networked information service. This question of presentation integrity in client-server computing has not been stressed enough in the academic world, LYNCH argued, though commercial service providers deal with it regularly.

Concerning multimedia, LYNCH observed that good multimedia at the moment is hideously expensive to produce. He recommended producing multimedia with either very high sale value, or multimedia with a very long life span, or multimedia that will have a very broad usage base and whose costs therefore can be amortized among large numbers of users. In this connection, historical and humanistically oriented material may be a good place to start, because it tends to have a longer life span than much of the scientific material, as well as a wider user base. LYNCH noted, for example, that American Memory fits many of the criteria outlined. He also remarked on the extensive discussion about bringing the Internet or the National Research and Education Network (NREN) into the K-12 environment as a way of helping the American educational system.

LYNCH closed by noting that the kinds of applications demonstrated struck him as excellent justifications of broad-scale networking for K-12, but that at this time no "killer" application exists to mobilize the K-12 community to obtain connectivity.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ DISCUSSION * Dearth of genuinely interesting applications on the network a slow-changing situation * The issue of the integrity of presentation in a networked environment * Several reasons why CD-ROM software does not network *

During the discussion period that followed LYNCH's presentation, several additional points were made.

LYNCH reiterated even more strongly his contention that, historically, once one goes outside high-end science and the group of those who need access to supercomputers, there is a great dearth of genuinely interesting applications on the network. He saw this situation changing slowly, with some of the scientific databases and scholarly discussion groups and electronic journals coming on as well as with the availability of Wide Area Information Servers (WAIS) and some of the databases that are being mounted there. However, many of those things do not seem to have piqued great popular interest. For instance, most high school students of LYNCH's acquaintance would not qualify as devotees of serious molecular biology.

Concerning the issue of the integrity of presentation, LYNCH believed that a couple of information providers have laid down the law at least on certain things. For example, his recollection was that the National Library of Medicine feels strongly that one needs to employ the identifier field if he or she is to mount a database commercially. The problem with a real networked environment is that one does not know who is reformatting and reprocessing one's data when one enters a client-server mode. It becomes anybody's guess, for example, what clients are doing with one's data if the network uses a Z39.50 server. A data provider can say that his contract will only permit clients to have access to his data after he vets them and their presentation and makes certain it suits him. But LYNCH held out little expectation that the network marketplace would evolve in that way, because it required too much prior negotiation.

CD-ROM software does not network for a variety of reasons, LYNCH said.

He speculated that CD-ROM publishers are not eager to have their products really hook into wide area networks, because they fear it will make their data suppliers nervous. Moreover, until relatively recently, one had to be rather adroit to run a full TCP/IP stack plus applications on a PC-size machine, whereas nowadays it is becoming easier as PCs grow bigger and faster. LYNCH also speculated that software providers had not heard from their customers until the last year or so, or had not heard from enough of their customers.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ BESSER * Implications of disseminating images on the network; planning the distribution of multimedia documents poses two critical implementation problems * Layered approach represents the way to deal with users' capabilities * Problems in platform design; file size and its implications for networking * Transmission of megabyte size images impractical * Compression and decompression at the user's end * Promising trends for compression * A disadvantage of using X-Windows * A project at the Smithsonian that mounts images on several networks *

Howard BESSER, School of Library and Information Science, University of Pittsburgh, spoke primarily about multimedia, focusing on images and the broad implications of disseminating them on the network. He argued that planning the distribution of multimedia documents posed two critical implementation problems, which he framed in the form of two questions: 1) What platform will one use, and what hardware and software will users have for viewing the material? and 2) How can one deliver a sufficiently robust set of information in an accessible format in a reasonable amount of time? Depending on whether network or CD-ROM is the medium used, this second question raises different issues of storage, compression, and transmission.

Concerning the design of platforms (e.g., sound, gray scale, simple color, etc.) and the various capabilities users may have, BESSER maintained that a layered approach was the way to deal with users' capabilities. A result would be that users with less powerful workstations would simply have less functionality. He urged members of the audience to advocate standards and accompanying software that handle layered functionality across a wide variety of platforms.
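As a minimal sketch of that layered approach (all names here are hypothetical, not from the talk), a server might keep several renditions of each image and hand every client the richest layer its platform reports it can display:

    # Hypothetical sketch: one logical image, several renditions,
    # matched to whatever capabilities the client reports.
    RENDITIONS = ["color24", "color8", "gray8", "mono"]  # richest first

    def best_rendition(client_capabilities):
        """Return the richest layer the client can display."""
        for layer in RENDITIONS:
            if layer in client_capabilities:
                return layer
        return "mono"  # the least powerful workstation still gets the base layer

    print(best_rendition({"color24", "color8", "gray8"}))  # -> color24
    print(best_rendition({"gray8", "mono"}))               # -> gray8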

BESSER also addressed problems in platform design, namely, deciding how powerful a machine to design for when the largest number of users have the least capable machines, yet one desires higher functionality. BESSER then proceeded to the question of file size and its implications for networking. He discussed still images in the main.

For example, a digital color image that fills the screen of a standard mega-pel workstation (Sun or NeXT) will require one megabyte of storage for an eight-bit image or three megabytes of storage for a true color or twenty-four-bit image. Lossless compression algorithms (that is, computational procedures in which no data is lost in the process of compressing [and decompressing] an image--the exact bit-representation is maintained) might bring storage down to a third of a megabyte per image, but not much further than that. The question of size makes it difficult to fit an appropriately sized set of these images on a single disk or to transmit them quickly enough over a network.
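Those per-image figures are easy to verify; a quick check, assuming a 1,024 x 1,024 "mega-pel" display:

    # Verify the per-image storage figures for a 1,024 x 1,024 screen.
    pixels = 1024 * 1024            # one "mega-pel" display
    MB = 1024 * 1024                # bytes per megabyte

    eight_bit = pixels * 1 / MB     # 1 byte per pixel  -> 1.0 MB
    true_color = pixels * 3 / MB    # 3 bytes per pixel -> 3.0 MB
    lossless = eight_bit / 3        # ~3:1 lossless     -> ~0.33 MB

    print(f"8-bit: {eight_bit:.1f} MB, 24-bit: {true_color:.1f} MB, "
          f"lossless-compressed: {lossless:.2f} MB")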

With these full-screen mega-pel images that constitute a third of a megabyte, one gets 1,000-3,000 full-screen images on a one-gigabyte disk; a standard CD-ROM represents approximately 60 percent of that. Storing images the size of a PC screen (just 8-bit color) increases storage capacity to 4,000-12,000 images per gigabyte; 60 percent of that gives one the size of a CD-ROM, which in turn creates a major problem. One cannot have full-screen, full-color images with lossless compression; one must compress them or use a lower resolution. For megabyte-size images, anything slower than a T-1 speed is impractical. For example, on a fifty-six-kilobaud line, it takes three minutes to transfer a one-megabyte file, if it is not compressed; and this speed assumes ideal circumstances (no other user contending for network bandwidth). Thus, questions of disk access, remote display, and current telephone connection speed make transmission of megabyte-size images impractical.
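The transmission times check out the same way; a sketch comparing the fifty-six-kilobaud case against a T-1 line, with both rates idealized (no protocol overhead or contention):

    # Time to move one uncompressed megabyte at two line speeds.
    FILE_BITS = 1024 * 1024 * 8     # a 1 MB image, in bits

    for name, bps in [("56 kbaud", 56_000), ("T-1", 1_544_000)]:
        minutes = FILE_BITS / bps / 60
        print(f"{name}: {minutes:.1f} minutes")

    # 56 kbaud: ~2.5 minutes (BESSER's "three minutes"); T-1: ~5 seconds.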

BESSER then discussed ways to deal with these large images, for example, compression and decompression at the user's end. In this connection, the issues of how much one is willing to lose in the compression process and what image quality one needs in the first place are unknown. But what is known is that compression entails some loss of data. BESSER urged that more studies be conducted on image quality in different situations, for example, what kind of images are needed for what kind of disciplines, and what kind of image quality is needed for a browsing tool, an intermediate viewing tool, and archiving.

BESSER noted two promising trends for compression: from a technical perspective, algorithms that use what is called subjective redundancy employ principles from visual psycho-physics to identify and remove information from the image that the human eye cannot perceive; from an interchange and interoperability perspective, the JPEG (i.e., Joint Photographic Experts Group, an ISO standard) compression algorithms also offer promise. These issues of compression and decompression, BESSER argued, resembled those raised earlier concerning the design of different platforms. Gauging the capabilities of potential users constitutes a primary goal. BESSER advocated layering or separating the images from the applications that retrieve and display them, to avoid tying them to particular software.
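To see the lossy trade-off concretely, one can write a single image at several JPEG quality settings and compare file sizes. This sketch uses the Pillow library, a modern tool assumed here purely for illustration (the Workshop predates it), with "photo.png" as a placeholder filename:

    # Save one image at several JPEG quality levels and report the
    # resulting file sizes. Requires Pillow (pip install Pillow);
    # "photo.png" is a placeholder for any source image.
    import os
    from PIL import Image

    image = Image.open("photo.png").convert("RGB")  # JPEG needs RGB

    for quality in (95, 75, 50, 25):
        out = f"photo_q{quality}.jpg"
        image.save(out, "JPEG", quality=quality)
        kb = os.path.getsize(out) / 1024
        print(f"quality={quality}: {kb:.0f} KB")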

BESSER detailed several lessons learned from his work at Berkeley with Imagequery, especially the advantages and disadvantages of using X-Windows. In the latter category, for example, retrieval is tied directly to one's data, an intolerable situation in the long run on a networked system. Finally, BESSER described a project of Jim Wallace at the Smithsonian Institution, who is mounting images in an extremely rudimentary way on the CompuServe and GEnie networks and is preparing to mount them on America Online. Although the average user takes over thirty minutes to download these images (assuming a fairly fast modem), nevertheless, images have been downloaded 25,000 times.

BESSER concluded his talk with several comments on the business arrangement between the Smithsonian and CompuServe. He contended that not enough is known concerning the value of images.

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ DISCUSSION * Creating digitized photographic collections nearly impossible except with large organizations like museums * Need for study to determine quality of images users will tolerate *

During the brief exchange between LESK and BESSER that followed, several clarifications emerged.

LESK argued that the photographers were far ahead of BESSER: It is almost impossible to create such digitized photographic collections except with large organizations like museums, because all the photographic agencies have been going crazy about this and will not sign licensing agreements on any sort of reasonable terms. LESK had heard that National Geographic, for example, had tried to buy the right to use some image in some kind of educational production for $100 per image, but the photographers will not touch it. They want accounting and payment for each use, which cannot be accomplished within the system. BESSER responded that a consortium of photographers, headed by a former National Geographic photographer, had started assembling its own collection of electronic reproductions of images, with the money going back to the cooperative.

LESK contended that BESSER was unnecessarily pessimistic about multimedia images, because people are accustomed to low-quality images, particularly from video. BESSER urged the launching of a study to determine what users would tolerate, what they would feel comfortable with, and what absolutely is the highest quality they would ever need. Conceding that he had adopted a dire tone in order to arouse people about the issue, BESSER closed on a sanguine note by saying that he would not be in this business if he did not think that things could be accomplished.
