A Bit About CD-RW Stuff

CD-R drive stuff....

The market has been awash with cheap, high-speed drives, sold in bulk and retail-packaged, for the last few years. CD-ROM technology is not a piece of cake, and two things to look for are the speed and the formats supported by the drive.

1) The speed of a CD-ROM drive is specified in the format Wx, where W is an integer (usually, but not always, even), for example 2x, 3x, 4x, 8x, 12x, etc. A single-speed drive (from the 80's) reads data at an average rate of 150 KB/sec; a 2x drive should read 300 KB/sec. Speed has nothing to do with the interface technology (IDE, SCSI, etc.).
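The multiplier translates directly into throughput. A quick sketch, assuming the nominal 150 KB/sec single-speed base rate:

```python
# Data rate of a CD-ROM drive from its Wx speed rating.
# 1x (single speed) corresponds to the nominal 150 KB/sec rate.
BASE_RATE_KB = 150

def transfer_rate_kb(multiplier: int) -> int:
    """Average sustained read rate in KB/sec for an Nx drive."""
    return multiplier * BASE_RATE_KB

for speed in (1, 2, 4, 8, 12):
    print(f"{speed}x -> {transfer_rate_kb(speed)} KB/sec")
```

Note these are nominal sustained rates; real drives vary with the position on the disc.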

2) A CD-ROM may contain not only DATA in ISO-9660 format (or the very similar High Sierra format), but also CD-DA (digital audio) tracks and other formats such as CD-XA. Cooked-mode reading is offered by most CD-ROM units, while raw-mode reading is used by some programs to copy a CD-DA track as a WAV, VOC, or other sound file into a filesystem. To check your CD-ROM drive for raw support under DOS, you can use int 21h service ax = 4402h[2].
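The difference between cooked and raw reads is the sector payload: a cooked Mode 1 sector exposes 2048 user bytes, while a raw read returns the full 2352-byte frame (which is also the CD-DA audio frame size). A short arithmetic sketch of what that means for a standard 74-minute disc (75 sectors per second):

```python
# Sector payloads: cooked (Mode 1 user data) vs. raw (full frame / CD-DA audio).
COOKED_BYTES = 2048   # user data per sector in cooked mode
RAW_BYTES = 2352      # full frame per sector in raw mode (CD-DA frame size)
SECTORS_PER_SEC = 75  # CD sector rate at 1x

minutes = 74
sectors = minutes * 60 * SECTORS_PER_SEC          # 333,000 sectors on a 74-min disc

data_mb = sectors * COOKED_BYTES / (1024 * 1024)  # cooked data capacity
audio_mb = sectors * RAW_BYTES / (1024 * 1024)    # raw audio capacity
print(f"{sectors} sectors: {data_mb:.0f} MB data, {audio_mb:.0f} MB audio")
```

This is why the familiar "650 MB" data capacity and the roughly 747 MB of raw audio on the same disc are both correct: the difference is the per-sector overhead exposed to the reader.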

NOTE: The IDE or SCSI interface does not matter very much, since it does not vary the reading speed of the device itself.

Want to put your Web site on a CD? OK:

Now you can place your HTML pages on a diskette, along with Browse and View, and the World Wide Web can be delivered on a diskette to anyone! Yes, an offline, standalone browser!

As easy as 1, 2, 3...:
To create your own royalty-free browser-on-a-disk, all you need to do is place the following files on a blank, formatted diskette (or recordable CD)...

BROWSER.EXE
BROWSER.MY
BROWSE.DLL

...along with all your HTML, .GIF, .WAV, .MID and .AVI files.

It's that simple! You now can distribute your Web pages on a diskette! Your end user (audience) does NOT need Netscape Navigator, or Internet Explorer, or even access to the Web. Browse and View is all they need to experience the Web on a disk!

Some of Browse and View's features:

A powerful offline HTML browser, runs from a floppy disk or CD-ROM.

Supports all HTML version 2 tags and most HTML version 3 tags.

Compact - only 300kb when compressed with PKZIP! 700kb when uncompressed.

Launch other programs and associate hyperlinks with helper applications. Browse and View can launch Media Player by clicking on .AVI, .WAV, and .MID files, or can execute .BAT batch files and run .EXEs.

View, print, and search HTML files.

Can I distribute a Web site on a CD-ROM?

You need to include the content and a browser on the CD. Some products that might be helpful are:

GEAR WebGrabber:
http://www.elektroson.com/PRODUCTS/GEAR_WEBGRABBER/WINDOWS/WG_W_ADPAGE.HTM

Browse and View:
http://www.pc-shareware.com/browser.htm

Tips for Recording:

To increase the CD recording success rate, connect the CD device with the faster transfer mode as the master drive and the slower one as the slave drive. For example, Acer Peripherals' CRW6206A/V runs in PIO Mode 4; if your CD-ROM drive only runs in PIO Mode 3, we suggest you connect the CD-ROM drive as the secondary slave and the CRW6206A/V as the secondary master. However, if both drives run in the same PIO mode, we recommend configuring your CD-ROM as secondary master and the CRW6206A/V as secondary slave in your system.
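For reference, the maximum burst transfer rates of the ATA PIO modes mentioned above are standard ATA figures; the helper function below is purely illustrative:

```python
# Maximum ATA PIO-mode burst transfer rates in MB/sec (standard ATA/ATA-2 figures).
PIO_RATES_MB = {0: 3.3, 1: 5.2, 2: 8.3, 3: 11.1, 4: 16.7}

def headroom_over_burn(pio_mode: int, burn_speed_x: int) -> float:
    """Ratio of the interface's PIO burst rate to the recorder's data rate
    (burn_speed_x * 150 KB/sec). A ratio well above 1 leaves safety margin."""
    burn_rate_mb = burn_speed_x * 150 / 1024
    return PIO_RATES_MB[pio_mode] / burn_rate_mb

# Even a 6x burn consumes only a small fraction of a PIO Mode 4 channel:
print(round(headroom_over_burn(4, 6), 1))
```

Even the slowest PIO mode is far faster than any burn speed of the era; the master/slave advice above is about bus arbitration and sustained stability, not raw bandwidth.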

A successful CD recording process cannot tolerate any interruption, so a stable system is a basic requirement. I recommend using an extra hard drive set up just for CD recording. This HDD should contain only Windows 95 or Windows NT, the CD recording software, and nothing else. Boot from this HDD and proceed with the CD recording. A simple environment like this makes for a smoother recording process.

If you do not have another HDD, you may consider the following:

(A) Before you make a CD recording, please close all memory-resident programs, including anti-virus programs, screen savers, System Agent, Power Management, etc. Auto-Insert Notification should also be turned off. Triggering any of these events may interrupt the CD recording session, which consequently ruins the CD disc.

(B) During the CD recording, please do not run any other application software. Doing so consumes CPU resources and can affect the CD recording. Maximizing or minimizing windows is also not recommended.

(C) Defragment the HDD before any CD recording. Defragmentation relocates the files so that file clusters are placed together. This is important when you write files from the HDD to CD, because it saves time and reduces risk while Easy CD Creator searches for the files you want to write to the CD disc.

(D) Please do not connect too many internal devices to the computer system. Heavy load on the system power supply may cause unstable power source, which could affect the CD recording.

How do I clean my CD recorder?

In general, you don't. The only reason you'd need to clean a recorder or (for that matter) a CD-ROM drive is if you went and stuck your finger on the lens. Cleaning kits and well-intentioned Q-tips are unnecessary and potentially dangerous.

If you have an overwhelming desire to clear the dust out of your recorder, and can't or don't want to send it to a service center, use gentle(!) bursts of compressed air.

Common CD Terms and Standards:
Enhanced CD (CD Plus): This recently standardized format includes data and audio tracks. Audio is recorded in the first session; data is recorded in the second. The data track cannot be accessed by conventional CD players.

Green Book: This standard defines the specification for Compact Disc Interactive (CD-I), a format that stores data, graphics, still video, full-motion video, and audio.

Hybrid: Hybrid is a special CD-ROM format that can be played faithfully on PC and Macintosh platforms.

ISO 9660: This international standard describes the file structure for computer files on CDs. ISO 9660 is platform-independent, so discs can be read on PC, Macintosh, and Unix file systems.

Joliet File System: Microsoft created this specification based on the ISO 9660 standard. Joliet supports long filenames and directory names and is compatible with Windows 95.

Mixed Mode: Mixed mode has data and audio tracks. Data is stored in the first session, and audio is stored in the remainder. Be careful not to play track one on a conventional CD player--you may blow your speakers.

Orange Book Part One: This standard describes Compact Disc Magneto Optical (CD-MO). Data can be written, erased, and then rewritten. CD-MO discs are incompatible with CD-Rs.

Orange Book Part Two: This standard describes Compact Disc Write Once (CD-WO), also referred to as CD-R. Discs can be written to, but data cannot be erased.

Red Book: This specification is for CD Digital Audio. Regular music CDs are considered Red Book discs.

White Book: White Book is for video CDs. Video CDs combine full-motion video and audio, typically compressed to the MPEG-1 standard.

Yellow Book Mode 1: This specific sector layout can store computer data, but doesn't necessitate high-level multimedia capabilities.

Yellow Book Mode 2: In this extension of Mode 1, more data can be stored at the cost of reduced error correction. Mode 2 is used with Photo CD and CD-ROM XA discs.

Creating a Bootable CD

1.Create a DOS boot diskette with FORMAT A: /S, then copy the CD-ROM device driver, MSCDEX.EXE, SYS.COM, and XCOPY.EXE onto it (see note A). Create a simple CONFIG.SYS with the lines:

lastdrive=z
device=my_CDrom_driver.sys /d:restore

Also create a simple AUTOEXEC.BAT containing MSCDEX.EXE /d:restore /l:z and copy both files to the boot disk.
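If you prepare boot diskettes often, the two files from step 1 can be generated by a short script rather than typed by hand. A minimal sketch in Python (the driver name my_CDrom_driver.sys is the placeholder from the text above):

```python
# Write the minimal CONFIG.SYS and AUTOEXEC.BAT for the boot diskette.
from pathlib import Path

def write_boot_files(dest: Path, driver: str = "my_CDrom_driver.sys") -> None:
    """Create the two boot files described in step 1 in the directory `dest`
    (in practice, the root of the A: diskette)."""
    config = f"lastdrive=z\ndevice={driver} /d:restore\n"
    autoexec = "MSCDEX.EXE /d:restore /l:z\n"
    (dest / "CONFIG.SYS").write_text(config)
    (dest / "AUTOEXEC.BAT").write_text(autoexec)

write_boot_files(Path("."))
```

The /d:restore names in the two files must match, since MSCDEX locates the driver by that device name; generating both from one script keeps them in sync.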

2.Launch Easy CD Creator.

3.Add all of your hard drive's contents to the Data CD layout page.

4.Select File | CD Layout Properties | Data Settings.

5.Check "Bootable CD."

6.Select "ISO 9660" and click on the Properties button. Select "Any MS-DOS 8+3 name."

7.Proceed with creating the CD. You will be asked to insert a bootable diskette; insert the disk you created in step 1.

NOTE A: If you wish, you can add the keyboard files you need for your country (copy the same entries you find in the C: drive's CONFIG.SYS and AUTOEXEC.BAT, without any paths, into the new diskette's CONFIG.SYS and AUTOEXEC.BAT, along with the files they reference); you might also want to include FDISK.EXE and FORMAT.COM to clear or modify the partition (especially useful after a virus attack).

If you're using a SCSI CD drive, you will also need the SCSI driver and a CONFIG.SYS line for the SCSI controller device driver (note that not all SCSI controllers and motherboards can be used to boot from CD-ROM).

Restoring from a Bootable CD

To use the bootable CD, you need a motherboard that supports booting from a CD-ROM drive; usually you can find this setting in the motherboard's BIOS under Boot Options. If the boot options include "CDROM", then you can do it. Be sure to set the CD-ROM choice first in the sequence (typical entries might be CDROM, A, C).

1.Select the CD boot option in the motherboard BIOS, as described above, insert the CD, and reboot the system. Now it will boot from the boot image on the CD, which will be seen as drive A: while the real floppy disk drive is seen as B:.

2.From A:, type SYS C: to transfer the boot files to the hard disk.

3.From C:, type A:\XCOPY.EXE Z:\*.* /S C:

Your whole drive is restored.

NOTE FOR WINDOWS 3.1 USERS:
To avoid problems with the restore function, it's a good idea to disable the swap file before burning the bootable CD. To do this, go to Control Panel | 386 Enhanced | Virtual Memory | Change, set it to "None", and reboot before burning so that Windows removes all references to the swap file. After the burn, you can set up the swap file again as usual; you will need to repeat this whenever you restore the system from the CD.


Doculabs Test Report:

Compatibility of CD-R Media, Readers, and Writers Prepared for National Media Lab November 30, 1996 Amended June 30, 1997

“We put technology to the test!”™

Table of Contents
Executive Summary
Introduction
Test Methodology
  Test Background
  Test Product Listing
  Test Approach
  Test Environment
  Detailed Test Process
Test Results
  Prior to Aging
  After Aging
Further Research
Appendix A: Technical Advisory Group
Appendix B: Data Points Collected
Appendix C: CDCATS Parameter Definitions
Appendix D: Detailed Results
Appendix E: CD-R Aging Procedure

Executive Summary

Doculabs and SIGCAT performed a comprehensive study of CD-R media, writers, and readers.

This report highlights the results, provides initial analysis, and offers suggestions for further research. Questions about this study should be directed to Dhaval Joshi, Richard Medina or James Watson at Doculabs (312-433-7793). Overall 168 discs (eight types of media) were burnt on seven writers (recorders) and then read using eight readers. The discs were then put through an accelerated aging process at the National Media Lab, and the reader tests were re-executed on the aged media. The results of the testing can be summarized into two groups.

Prior to Aging: There were no incompatibilities between any of the media/writer/reader combinations when manually tested, with students randomly reading and executing files on the discs. Thus users of CD-R equipment are likely to succeed when using the media, writers, and readers we tested. The analyzers indicated possible low-level problems, though these were not significant enough to render the discs unreadable. However, manufacturers of CD-R equipment should be concerned with possible “out of spec” scenarios that may cause problems in the future.

After Aging: Three of the eight media types were completely unreadable after the aging process. Users of CD-R are well advised to determine the application for which the CD-R media is being used, whether long-term archival or short-term storage, and then select the appropriate media.

In all, we collected a total of 10,248 data points during the test. For a breakdown of the data points collected, refer to Appendix B. Even so, the results should be considered preliminary -- though many questions have been answered in this test, just as many new questions have been raised.

Introduction

CD-R media has become one of the most popular storage technologies in the industry today. However, CD-R technology is not without problems. There is considerable debate over the compatibility between different brands of CD-R media, CD-R readers, and CD-R writers. Though anecdotal evidence suggests the presence of some incompatibilities across brands, no published research or test results were available to prospective users of CD-R technology. The reports of problems, however, had the potential to be detrimental to both users and manufacturers. With this in mind, Doculabs, an independent research and testing consultancy (with the assistance of the University of Illinois at Chicago), was commissioned by the National Media Lab to conduct a controlled compatibility and aging test. This test was designed to identify incompatibilities between different brands of CD-R media, writers, and readers, as well as the effect of aging.

This study is one of the first systematic tests of compatibility between leading media, readers, and writers not performed by or funded by a manufacturer. The study was a collaborative effort between Doculabs, the University of Illinois at Chicago, and the SIGCAT Foundation, a non-profit organization that educates users about CD-related technology. SIGCAT established a panel of experts to provide advice on the testing methodology, worked with Doculabs to develop the testing methodology, and coordinated the loan of equipment to be used in the test. Doculabs conducted the testing, analyzed the results, and monitored the heat stressing of the discs conducted by the National Media Lab. The report was prepared jointly by SIGCAT and Doculabs.

Test Methodology

This section provides information on the test background, the listing of products tested, the test approach, the test environment, and the detailed test process.

Test Background

Before testing began, SIGCAT assembled a Technical Advisory Group (TAG - see Appendix A) of independent industry and government experts to review and advise SIGCAT and Doculabs. This group helped identify the brands of CD-R media and writers to test, provided guidance on the types of problems typically encountered in testing, and suggested specific test procedures. In parallel, a literature search was conducted to identify similar studies and assess their results. Once the specific brands to be tested were identified, SIGCAT contacted the vendors and arranged for product to be shipped to Doculabs' test facility in Chicago.

This equipment included the CD writers and CD-ROM drives. Media was purchased independently on the open market. All vendors voluntarily agreed to participate in the study. Representatives from these vendor organizations formed the Industry Advisory Group.

Test Product Listing

This section lists the CD-R media, writers, and readers Doculabs used in this study.

Writers

Doculabs tested five units marketed by the original manufacturer. Two units were relabeled units: the HP and the Pinnacle.

Writer         Procurement Method
HP 2x          Loaned - HP
Yamaha 2x      Loaned - Yamaha
Yamaha 4x      Loaned - Yamaha
Philips 2x     Loaned - Philips
Sony 2x        Loaned - Sony
Ricoh 2x       Loaned - Ricoh
Pinnacle 2x    Loaned - SIGCAT

Table 1 - CD-R Writers Used in the Testing

Media

The media reflect all the originally manufactured brands on the market at the time our test was scheduled to begin, except Hewlett-Packard's, which is relabeled Mitsui.

Media          Procurement Method
HP             Bought
Kodak          Bought
Mitsui         Bought
Ricoh *        Bought
TDK            Bought
Taiyo Yuden    Bought
Avery Labeled  Bought
Verbatim       Bought

Table 2 - CD-R Media Used in the Testing

* Purchased directly from Ricoh (channels were not in place to acquire the media from other distributors).

Readers

The CD-ROM drives represented a wide range of units, from a 1x drive that was several years old to an 8x just introduced by Plextor.

Reader                        Procurement Method
Philips 6x                    Loaned - Philips
Plextor 8x                    Loaned - Plextor
Mitsumi 4x                    Loaned - CD-ROM Inc.
Sony 4x                       Loaned - Sony
Toshiba 2x                    Loaned - Doculabs
Sun Moon Star 4x (Panasonic)  Loaned - SIGCAT
Matshita 1x                   Loaned - SIGCAT
NEC 6x                        Loaned - NEC

Table 3 - CD-R Readers Used in the Testing

Test Approach

At a high level, the test consisted of the following steps:

1. Using each writer, create three CDs of each media type.
2. For each CD, perform read/execute testing, data comparison testing, and recording error testing on each reader.
3. Submit the CDs to the National Media Lab for accelerated aging.
4. For each CD, perform read/execute testing, data comparison testing, and recording error testing a second time on each reader.

The data burned on each CD consisted of a variety of file types and sizes, including executables, text, graphics, audio, and video files. These files represent a typical mix of file types that users might burn and access on recorded discs. Executable files offer a good opportunity to verify recording accuracy, because the files will not operate properly if there are recording errors. In addition, the test data included large numbers of small files and directories several levels deep, which were designed to challenge the recording process for large hierarchies. Before beginning the official tests, Doculabs made a number of test discs in order to verify that the recording systems were functioning properly and that the test data would fill the discs (thus recording on the sometimes less-reliable outer tracks).

For the recording error testing, Doculabs obtained test equipment and utilities from Audio Development (CD CATS), Clover Systems (QA 101), and Corel (Xcomp). In cooperation with the producers of the test equipment, Doculabs developed a suite of proposed tests and standards based on commonly used performance measures. Doculabs sent this list to the TAG for review (see Appendix C for a listing of measurements). It should be noted that presently no official ANSI standard or set of tests for CD-Recordable discs exists.

Test Environment

As many factors as possible were held constant in order to test the independent variables most effectively.

This section describes the components of the test environment.

Hardware

Gateway 486 DX2/66
  32 MB RAM
  Western Digital Caviar 540 MB IDE drive
  Adaptec 1542CP
  External SCSI hard drive (Seagate)

Dell Pentium 100
  32 MB RAM
  256 KB pipeline burst cache
  PCI bus structure
  Western Digital Caviar 1.6 GB EIDE drive
  Adaptec 1542CP
  S3-based video card

Acer Pentium 75 motherboard
  16 MB RAM
  PCI bus structure
  Adaptec 1542CP
  540 MB Maxtor hard drive
  ATI Mach 64 PCI video card
  256 KB cache

Software

Premastering software: Corel CD Creator 2.0 (now Adaptec) for Windows 95 - supports all brands of writers we tested

Detailed Test Process

The following figure identifies the detailed process that Doculabs followed in conducting the compatibility testing.

Figure 1 - Detailed Test Process

The detailed testing process was conducted as follows:

1. All CD media were inspected for any visible errors. Additionally, the current temperature and humidity in the lab were recorded.
2. Test data was burned onto the CDs. For each writer, we created three separate CDs on each different brand of media, for a total of 168 discs (3 pieces of media per brand, 8 brands of media, and 7 writers).
3. Each CD was visually inspected for imperfections after burning.
4. A read/execute test was performed on each CD using each of the readers. This included reading a sample of the files using their native applications and running executables to see if they operated properly. A total of 18 manual tasks were conducted on each disc.
5. A data comparison test was performed on each CD using Corel's "Xcomp" utility. This is a bit-level comparison that ensures all the data burnt onto the CD is identical to the source files.
6. A test was conducted on each CD using the CD CATS machine and the Clover system to test for 12 types of recording errors.
7. Of the three CDs of each media brand created on each writer, two were sent to the National Media Lab for aging: 112 CDs in all underwent an accelerated aging stress test. After simulating approximately 100 years of aging (750 hours), the CDs were returned to Doculabs.
8. Each CD was visually inspected for imperfections.
9. The read/execute test was performed again on each CD using each of the readers (the same 18 manual tasks per disc).
10. The data comparison test was performed again on each CD using Corel's "Xcomp" utility.
11. A test was conducted on each CD using the CD CATS machine to test for the 12 types of recording errors.
Test Results

The test results are divided into two sections: prior to aging and after aging. Within each section, the analysis discusses conclusions about media, writers, and readers where appropriate. The test results for media are also broken down by either manual data collection (via an operator physically reading each piece of media) or automatic data collection (with the CD analyzers or the Xcomp utility). The test parameters used in the analysis are those agreed to by the TAG. For the reading/executing portion of the test, a total of 18 manual tasks were performed on each CD, including reading files and running executables. If any of these manual tasks was unsuccessful, the disc was scored as failing. Additionally, the automated portion of the test used CD analyzers to look for 12 specific error types. Of these 12 failure types, we looked most closely at 4 specific error types: Push-Pull, I11, BLER, and Window Margin. For a complete list of these parameters, refer to Appendix C.

Prior to Aging

Media: In the manual tests, of the eight media types burnt on all of the writers, all were successfully read across all readers. This is significant because it indicates that average users with current technology can expect to be successful with any combination of the CD-R equipment and media that we tested. In the automated tests, several errors were noted, yet the errors were not uniform across all writers. Although certainly not conclusive, the data does suggest that even though low-level problems exist, the media was acceptable for normal use.

Writers: Clearly, some writers are better than others.

The best performers across all media types were the Philips and Ricoh.

The worst performers were the Pinnacle Micro and the Yamaha 4x.

Readers: In the manual tests, all of the readers were successful at performing each of the 18 different tasks with all of the media. Again the data suggests that, although low-level problems may exist, the readers overcome the problems and recover from the errors.

After Aging

Media: For the manual tests, three out of the eight media types could not be read. These included Taiyo Yuden, TDK, and the Avery-labeled TDK discs. All of the other media types were acceptable.

The automated tests confirm the manual tests' findings. Additionally, Mitsui, HP, and Verbatim performed the best in the CD CATS analysis.

Writers: Although the writers were not a component in the aging process, there may be an interaction between the writer and the impact of aging on the media. Unfortunately, no conclusive evidence exists.

Readers: For the manual tests, the failed reads in the "after aged" test mirrored the media failures, with three media types totally unreadable. Other sporadic errors occurred, including 10 errors with the Toshiba 2x, 5 errors with the Mitsumi 4x, 10 errors with the Philips 6x, and 7 errors with the Plextor 8x.

Anomalies: As with any experiment, a number of anomalies occurred which are worth further investigation. The first is with the TDK media: TDK disc #7 was readable after aging, while none of the other TDK discs were. Additionally, there were significant problems with the TDK media upon visual inspection. Finally, it is impossible to determine whether the problem encountered with the Avery-labeled TDK discs was due to the media or the label, so those results are inconclusive.

Further Research

There are a number of questions that remain unanswered at the conclusion of this test. The list below provides a high-level overview of each.

Issue: Sample size is too small to be conclusive.
Suggested Research: Increase sample size and repeat test.

Issue: There is no standard aggregate measure of CD-R acceptability after recording.
Suggested Research: Define a comprehensive standard that would simplify the debate over “good” and “bad”.

Issue: It is unclear if disc labeling or marking has an effect on media aging.
Suggested Research: Test laser-printed, labeled, and pen-marked discs.

Issue: The Avery-labeled discs performed the best prior to aging in the low-level analysis.
Suggested Research: Because the Avery label protects the discs, determine if handling has a significant impact on disc writing.

Issue: Is there a correlation between the errors in the CD CATS analysis of the pre-aged media and the media that failed after aging?
Suggested Research: Conduct analysis on existing data if sufficient, or perform after extensive testing is complete.

Issue: Test equipment inconsistency: the test results from the CD CATS and the Clover did not agree in many cases.
Suggested Research: Conduct a test isolating the analysis equipment to determine any inconsistencies.

Issue: What is the effect of further aging?
Suggested Research: Send existing acceptable media back to NML for more aging.

Issue: What caused the failed media to fail?
Suggested Research: Conduct additional analysis on the media (such as microscopic or chemical testing).

Issue: Is there a correlation between writers and the effects of aging?
Suggested Research: Increase writer sample size and repeat test.

Issue: Can certain readers read bad or poor media better than others?
Suggested Research: Increase reader sample size and repeat test.

Table 4 - Further Research

Appendix A: Technical Advisory Group

Al Betts, National Technical Information Service, (703) 487-4808, e-mail: [email protected]
Darlene Carlson, National Media Lab, (612) 733-5287, e-mail: [email protected]
Katherine Cochrane, CD-Info Company, Inc., (205) 650-0840, [email protected]
Scott Fast, InSolutions, (415) 966-7147, e-mail: [email protected]
Dennis Howe, University of Arizona, (602) 621-4995, FAX (602) 621-4358, e-mail: [email protected]
Jason Hyon, Jet Propulsion Lab, (818) 306-6054, FAX (818) 306-6929, e-mail: [email protected]
Linda Kempster, IITRI, (301) 918-1037, e-mail: [email protected]
Matthew Leek, New Z Productions, (408) 246-5609, [email protected]
D. Lorentz, National Media Lab, (612) 736-7270, [email protected]
J. McFaul, Geological Survey, (703) 648-7126, [email protected]
Mike Martin, Jet Propulsion Lab, (818) 306-6054, FAX (818) 306-6929, e-mail: [email protected]
Dana Parker, Independent Consultant, (303) 733-1435
Fernando Podio, National Institute of Standards & Technology, (301) 975-2947, e-mail: [email protected]
Karl Schneck, Time Warner, (717) 383-3535, e-mail: [email protected]
Robert Starrett, Independent Consultant, (303) 733-1435

Appendix B: Data Points Collected

This test culminates in the single largest sampling of CD-R media comprehensively analyzed, with 10,248 total data points (observations).
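The headline figure is straightforward to verify from the study design (the per-disc observation counts are itemized below); a quick arithmetic sketch:

```python
# Sanity-check the data-point totals reported in Appendix B.
media_types, writers, copies = 8, 7, 3
discs = media_types * writers * copies          # 168 discs burned
aged_discs = media_types * writers * 2          # 112 discs sent for aging

per_disc_before = 18 + 3 + 12 + 4 + 2  # manual + compare + CD CATS + Clover + temp/humidity
per_disc_after = 18 + 3 + 12           # no Clover or temp/humidity readings after aging

total = discs * per_disc_before + aged_discs * per_disc_after
print(discs, aged_discs, total)  # 168 112 10248
```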
The breakdown of this data point collection is as follows:

Number of Specimens: 168 discs (8 media types, 7 writers, 3 pieces of media each)

Prior to Aging - Observations Taken (per disc):
  File Read and Execute (Manual)    18
  CD Comparison (Automated)          3
    (File Size, File Count, Bit-level compare)
  CD CATS                           12
  Clover                             4
  Temperature & Humidity             2
  Total:                            39 per disc
  Total Observations:            6,552

After Aging (only 112 discs) - Observations Taken (per disc):
  File Read and Execute (Manual)    18
  CD Comparison (Automated)          3
    (File Size, File Count, Bit-level compare)
  CD CATS                           12
  Total:                            33 per disc
  Total Observations:            3,696

TEST TOTAL:                     10,248

Appendix C: CDCATS Parameter Definitions

For the CDCATS portion of the testing, Doculabs tested the following CDCATS parameters:

BLER (Block Error Rate, errors per second): The total number of 24-user-byte blocks which contain errors encountered by the first decoder stage (C1) during each second, regardless of size or distribution.

E11 (errors per second): The total number of blocks containing one erroneous symbol encountered by the C1 decoder during each second.

E31 (errors per second): The total number of blocks containing three erroneous symbols encountered by the C1 decoder during each second.

E22 (errors per second): The total number of blocks containing two erroneous symbols encountered by the C2 decoder during each second.

E32 (errors per second): Regarded as uncorrectable errors, the total number of blocks containing three erroneous symbols encountered by the C2 decoder during each second. E32tot is the total number of E32 errors which occurred during the test.

BERL (Burst Error Length, stated in blocks): The number of consecutive erroneous blocks encountered by the C1 decoder.
I3 and I11 (unitless): I3 refers to the amplitude of the reflected light from the smallest features within the stream of pits and lands, normalized by the reflectivity of the disc. I11 refers to the amplitude of the reflected light from the largest features within the stream of pits and lands, normalized by the reflectivity of the disc.

Jitter (ns): Jitter refers to the precision with which the features on the disc are reproduced by the replication or recording process. Jitter is the standard deviation of the histogram produced by a collection of measurements for each pit and land feature from 3T to 11T.

PP (Push-Pull, unitless): Measures the strength of the tracking signal used by a single-beam optical system, normalized by the reflectivity of the disc.

RN (Radial Noise, nm): Measures the difficulty encountered by the tracking servo system in following the stream of pits, marks, or molded pre-groove.

SYM (Symmetry, %): Indirectly indicates the balance between the pits and lands by measuring the position of the I3 signal within the envelope of the I11 signal, as viewed in the HF signal on an oscilloscope.

Weighted Average Window Margin (%): Combines all 36 parameters measured by the time interval analysis system into one figure of merit. Window Margin is intended to predict the decoder margin available for player-related errors remaining after the accuracy and precision of the pits (or marks) and lands of the disc itself have been considered.

Appendix D: Detailed Results

See attached: CD CATS data before the aging test; CD CATS data after the aging test; results of the reading test conducted after the aging test.

Appendix E: CD-R Aging Procedure

This appendix, not part of the original report, was added by John VanBogart, National Media Laboratory, on May 16, 1997.
CD-R discs were aged in accordance with procedures described in the American National Standard "Life Expectancy of Compact Discs (CD-ROM) -- Method for Estimating, Based on Effects of Temperature and Relative Humidity" (ANSI/NAPM IT9.21-1996). No specific test procedure has been developed for determining the life expectancy of CD-R media; the ANSI CD-ROM LE test standard was used as the best alternative choice. The ANSI IT9.21 standard describes the aging of at least 80 test specimens distributed among five different environmental conditions for a minimum total aging time of 2000 to 4000 hours. CD-ROM discs are removed periodically for testing at intervals of 500 to 1000 hours. To prevent thermal and hygroscopic shocking of the media, discs are placed in an environmental chamber at ambient conditions and the oven is slowly ramped up in temperature and humidity over a period of 3 hours. Before aged media is removed from the environmental chambers, the temperature and humidity are ramped down over a 3-hour period with an intermediate equilibration period of 6 to 11 hours.

There were an insufficient number of CD-R discs available in this study to allow an estimate of media life expectancy using the ANSI IT9.21 standard. Instead, all CD-R discs were placed in a single environmental chamber. Disc placement in the environmental chamber was randomized to prevent clustering of manufacturers within a specific region of the chamber. Discs were held vertically in a polycarbonate CD-ROM disc rack. Aging conditions are summarized below:

Temperature:           80 C
Humidity:              85% RH
Incubation Duration:   A single 750-hour period (~1 month)
Ramping Rates:         Half that of ANSI IT9.21 recommendations
Equilibration Time:    Twice that of ANSI IT9.21 recommendations

Extreme care was taken to avoid thermal and hygroscopic shocking of the CD-R media during the aging process.
Ramping and equilibration times outlined in the ANSI procedure were doubled for the purposes of this study. In spite of efforts to prevent shocking of the media, the environmental chamber experienced three electrical malfunctions during the aging period. This caused the humidity in the ovens to drop to below 20% RH for periods of 1 to 4 days. The decrease in humidity occurred over a 1-2 hour period. The humidity in the ovens was restored to 85% RH in about an hour. It is not possible to quantify the life expectancy of the CD-R media on the basis of the accelerated aging procedure used in this study. Furthermore, without knowing the exact energy of activation for the degradation process, an acceleration factor cannot be determined for the environmental condition used in this study. (If the rule-of-thumb applied that media life is cut in half for each 5 C increase in the storage temperature, then storage at 80 C for one month would be the equivalent of 170 years of storage at 25 C.) At best, the test results provide a relative comparison of the chemical stabilities of the CD-R media at elevated temperatures and humidities.
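The rule-of-thumb calculation in the parenthetical above can be reproduced directly (the report rounds the result to roughly 170 years):

```python
# Rule of thumb: media life halves for each 5 C increase in storage temperature.
# Aging at 80 C therefore accelerates degradation relative to 25 C storage by:
delta_c = 80 - 25                    # temperature increase over normal storage
acceleration = 2 ** (delta_c / 5)    # 2^11 = 2048x faster degradation

months_in_oven = 1                   # the single ~750-hour incubation period
equivalent_years = months_in_oven * acceleration / 12
print(acceleration, round(equivalent_years))
```

As the report itself cautions, this is only a rule of thumb: without the actual activation energy of the degradation process, the acceleration factor cannot be determined rigorously.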

