ABC's of Authentication
A is for Atom, B is for Bit and C is for Care(1)

David G. Masse(2)

Text of a paper presented by the author at a national summit conference held under the auspices of the Canadian Association of Law Libraries in Toronto on * 1997 and published with the Summit proceedings on the Web site at:
 Everything one needs to know for survival and prosperity in the brave new information age is subsumed in the title of this paper. It bears a little explanation however.
 If the reader is reading these words as ink printed on paper, the reader is having an analog experience. The analog information world is by and large the world we know, and it is a world of atoms. Both the ink and the paper used to convey the information you are reading are very real and tangible. They have a physical presence which will not be denied. They are constituted of atoms. Analog methods are the methods the world has grown accustomed to since the dawn of human civilization.
 If the reader is reading these words from a computer screen, the reader is having a digital experience. The digital experience, like the word "digital" itself, has only been with us a very, very, short time. It has been with us long enough however, that it needs to be explained in order to be properly understood.
 There has been no shortage of "digital" things in our lives. The first digital products were digital clocks, which made their appearance in the late sixties and early seventies in clock radios and kitchen stoves. Later, in the mid-seventies, we were able to buy digital watches, truly marvels of modern science. Somewhat later, digital watches became commonplace. Soon, every consumer product known to mankind was proclaiming that its new and improved incarnation was digital in some way.
 All of those 'digital' products indeed heralded a new world, but most were only superficially digital. The real impact of digital technology did not really make itself felt until the mid to late eighties when digital information began to be commonplace. The compact disc ushered digital sound into our homes and the personal computer, successor to the 'home computer', introduced digital information first into our offices, and then into our homes. It is this use of the word 'digital' that we really need to understand in order to appreciate the changes taking place in the world around us.
 In order truly to appreciate the impact of digitization, it is essential to understand the analog information paradigm.
 In the analog world we know, the purpose for which information is designed always dictates the form the information takes. Thus, the need to distribute the news daily in printed form, quickly, in large volume, cheaply, and over a relatively large geographical area, inevitably gives rise to the newspaper format which is universally known and used for that purpose. On the other hand, when the time constraint is somewhat more relaxed, when deadlines are monthly, more time and attention can be brought to the presentation and delivery of the information, and we then witness the phenomenon of the magazine. Once again the magazine format is a universal format for the publication of information. When conservation over long periods of time is thought to be necessary and the document is to be prepared by a diverse group of people, filed and stored, the requirement gives rise to the 'deed' format which was traditionally, and in large measure still is, used in real property conveyancing.
 In this sense, Marshall McLuhan was right when he expressed his now famous thought to the effect that "the medium is the message".(3) In the analog world, the information we convey is inescapably tied to the medium in which it is expressed and the two are therefore inextricably bound together.
 The notion of the essential originality of documents is a manifestation of the molecular bond between information and the medium in which it is expressed. The originality of any given document is that which we rely upon to authenticate the information it contains. The most striking examples of this are of course bank notes. Paper currency as we know it exploits a number of physical traits of paper and ink so as to authenticate the intrinsic value which the bank note represents. The atoms, which comprise the document, attest to its authorship and hence to its authenticity.
 Applied to legal information, the analog authentication paradigm works in the following way: law reports are prepared and published by reliable sources such as private and government-owned legal publishers, for the most part type-set, printed in large numbers, and bound in volumes distributed to hundreds of law libraries operated under the authority of university law faculties, bar associations, government ministries and law firms. The act of locating information concerning a given judgment of the Supreme Court of Canada, in a bound volume of the Supreme Court Reports, in a law library, automatically authenticates the information contained on the pages. No one is going to question in a serious way the provenance or truth of the report. This is so because the act of systematically assembling the information, binding it to so many atoms of paper using complicated and expensive processes, and then distributing those atoms from coast to coast and beyond, can only be accomplished under the watchful eye of the Canadian legal establishment. While it would be possible for a mean-spirited individual to forge a volume of Supreme Court Reports quite effectively, it would not be possible to replace all the volumes likely to be consulted in the context of a given case. The analog process, therefore, automatically authenticates the content of the law report.
 The digitization of information is a very simple process. In its simplest expression, digitization is the translation of information into a new language, and that language is binary. Unlike our alphabet, which is made up of 26 characters, and our numbering system, which is made up of 10, binary language is comprised of only two characters: "0" and "1".
 The great advantage of binary language is its overwhelming simplicity. The characters of binary language, known as "bits", while initially expressed as "1" and "0", can be expressed as well by the combination and alternation of any two distinct conditions. The presence of light and absence of light, a positive electrical charge and negative electrical charge, the peak of a wave and trough of a wave, and many other observable states of matter suffice to record and transmit information in binary language.
 Because binary language is so very basic, it is also extremely inefficient as a medium of expression. Although the letter "a" can be expressed with a single pencil stroke, its binary equivalent requires the use of at least seven bits and is written as "1100001". While the inefficiency of binary language makes it essentially unreadable to humans, it is an ideal language for machine communication.
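The seven-bit encoding of the letter "a" can be verified in a few lines (a simple sketch using the ASCII values built into most programming languages):

```python
def to_bits(text):
    """Return the 7-bit binary representation of each ASCII character."""
    return [format(ord(ch), "07b") for ch in text]

print(to_bits("a"))    # ['1100001'] -- one pencil stroke becomes seven bits
print(to_bits("abc"))  # ['1100001', '1100010', '1100011']
```

The seven-fold expansion is exactly the inefficiency described above: readable to machines, opaque to people.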
 Submitted to the processing power of modern computers, binary language becomes a very powerful tool. Textual information in digital form can be created, recorded, printed, stored and transmitted in vast quantities and at astonishing speeds. The processing capacity of the computers used to manipulate digital information also enables the transformation of the way in which information is presented. The best example of this transformation is hypertext which allows information in one digital record to be linked to related information in another digital record, allowing the reader to pursue information in more than the single linear dimension afforded by analog publications.
 The principles of digitization apply not only to textual information but to visual and aural information as well. All information is susceptible of being created or re-created in binary language and then being recorded, played, displayed, stored and transmitted in much the same way as digital text. The combination of digital text, sound and images in the new format known as multimedia gives rise to a far richer information medium than anything which has existed before.
 In addition to the incredible richness of digital media, digital publishing raises issues of stark economic efficiency. The personal computer serves as a one-stop publishing tool. Everything from text, to sound, to images, can be created in digital form on the average multi-media personal computer and can be published on the Internet for the world to consume without resorting to editors, publishers, printers, distributors, carriers and booksellers. This is the economic phenomenon of disintermediation. More than any other aspect of digital information, it is the new economics of the information age which is driving digital information.
 The shift from analog publishing methods to digital ones thus brings tremendous benefits and opportunities, but risks as well. The very nature of digital information and the important ways in which it differs from traditional analog media gives rise to risks which need to be identified, understood and eventually managed.
 Digital publishing is the first method for disseminating information in which the information is not inextricably bound to a physical medium. For the very first time, information can be separated from the medium in which it rests and be transferred to other media. As long as the series of "1"'s and "0"'s is faithfully reproduced, digital information loses none of its initial quality. A given text, image or sound can be copied any number of times, using a variety of different techniques and recording and storage media, without suffering a degradation of its content in any respect. The concept of quintessential, molecular, atomic, originality which is the hallmark of the paper document is therefore not a feature of digital information.
 The absence of essential originality is a quality which presents substantial challenges when one attempts to establish the authenticity of a digital record. The reality of the digital document is that a bit, is a bit, is a bit. There is nothing to distinguish one bit from another. All are absolutely identical. The only thing which distinguishes one digital record from another, is the order in which the bits are presented.
 Unlike the analog paradigm for the publication of legal information, the digital paradigm cannot supply authentication as an integral part of the system. Authenticating digital content requires that special care be taken in the production of the information.
 The impact of open networks
 When the Internet is taken into account, the authentication problem associated with digital information is made more acute by several orders of magnitude.
 Commercial and professional analog voice communications have been traveling on the global telecom infrastructure for a very long time now. If, at some point in the past, the security and propriety of communicating sensitive or valuable information in this way was questioned, those questions have long since been put to rest. The introduction of widespread commercial data communications over the telecom infrastructure in the form of point-to-point data communications and facsimile transmissions has not, to date, given rise to much in the way of legal or social controversy as to the propriety of communicating in that fashion.
 Open networks, such as the Internet, obey rules which differ quite materially from the traditional, circuit-switched, point-to-point telecom infrastructure with which we are now very familiar. Even though the telecom infrastructure is not extraordinarily secure and lends itself quite readily to both licit and illicit interception of traffic, we have come to terms with it and we accept that the medium it provides for our commercial communications is reasonably secure, at least as regards the vast bulk of both personal and commercial traffic with which we routinely entrust it.
 Open, packet-switched networks like the Internet derive their extra-ordinary efficiency by minimizing the infrastructure needed to allow communications to occur. The network relies on its openness to achieve its ends: binary data packets must be easily inspected by each node encountered on their trek across the wired and networked globe so that they can be handed off in the probable direction of their intended destination.
 At the present time, the business community relies, without much, if any, concern, on point-to-point voice communications over the telecom infrastructure. The needs of identification, integrity, confidentiality and authentication are met quite well, as in the case of the publication of legal information, by the analog nature of the system. The circuit-switched nature of the telephone system performs much of the task of ensuring both the integrity and authenticity of our transmissions: the number assigned by the local telephone company authenticates the terminal end of the communication (i.e., for a single residential telephone line, the physical address at which the line terminates) and the voice of the person to whom we are speaking does the rest, as we verify the subtleties of tone, inflection and intonation of the speaker against the voice of the person we remember. The integrity and confidentiality of the message we hear is vouched for by the logical coherence of speech, and our knowledge that (except for party lines still found in some rural and cottage areas) the interception of our conversation is technically somewhat difficult and is in fact quite unlikely.
 As the business community began to use the telecom infrastructure increasingly for data communications in the late 1970's, in large measure it transferred to its data communications the trust developed through long reliance on the telecom system for voice communication without giving much thought to the fundamental differences between voice and data communication. The source of our existing faith and trust in the integrity of the telecom infrastructure stems from our long collective experience in analog voice communications.
 In data communications however, the traditional authentication and verification tools we employ no longer work for us: our bits and bytes look and sound pretty much like everyone else's bits and bytes. We are able to verify that our message was received intact in a point-to-point data communication by periodically transmitting bits back to the sender for verification against the bits originally sent, but we have no way of knowing precisely who the reply is coming from. Thus, taking the most prevalent example of data communications failure, every day, clerks in businesses all over the world transmit faxes to the wrong destination by inadvertently keying in the wrong telephone number. No one is the wiser until the intended recipient denies receiving the message. Even then, we assume that the machine failed in some way, rarely considering that the message is now in the wrong hands. Nevertheless, the risks inherent in point-to-point switched data communications have not generated much, if any, attention and concern.
 Open packet-switched networks are quite another kettle of fish however. In the case of the open network, anything goes. Communications can be diverted, copied, altered, replayed, rerouted, etc., etc., etc. We have no such long familiarity with, and trust in, open networks, and though our data communications in open networks most often start with a point-to-point telephone link, there the similarity ends. Experts tell us that this new medium is quite insecure.
 For the Internet to perform a role as an appropriate medium for the exchange of valuable digital records, there must be a way to make sure that the senders and recipients of 1's and 0's are known with some degree of reliability and that some mischievous spirit can't easily alter the sender's sequence of 1's and 0's on their way to the recipient.
 Simply securing the Internet to make it work more like the telephone network is not sufficient to accomplish the type of messaging integrity that the eventual information infrastructure demands. A point-to-point, circuit-switched communications network like the existing phone system succeeds in supplying a degree of authentication (restricted to the registered owner of the telephone line) but does little or nothing to vouch for the authenticity of digital records.
 Handling digital information with care
 We have seen that a digital record derives its singularity not from its physical nature, but from the sequencing of its bits. A string of bits must always be interpreted in order to yield its meaning. Even ASCII(4) text, which, in some ways, is the lowest common denominator of digital information, must be interpreted if it is to reveal its meaning. If the sequence of "1"'s and "0"'s is disturbed, the string is altered, and the resulting meaning will be changed, perhaps lost. This alteration, when it occurs, is most often unintentional, but sometimes may be malicious.
 When digital information is created, used and stored in a closed environment, the risk of alteration can be managed and contained. When the same information is transmitted in open networks such as the Internet, controls are absent and the data becomes relatively vulnerable. Steps must therefore be taken to shield the data itself from interference. In addition to shielding the data, special steps must be taken to authenticate it.
 The bits themselves must always remain ones and zeroes, but the sequence in which they appear may be manipulated at will. At one end of the manipulation spectrum, it is not very difficult to alter a string of bits representing ASCII text to change the message "a,b,c" to "c,b,a". At the other end of the spectrum, it is child's play to randomly redistribute the bits of the same sequence and thereby destroy their meaning beyond hope of recovery. In between those two extremes lies a world of possibilities.
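Both extremes of the spectrum can be demonstrated in a few lines of code (a simple sketch; the message and the shuffling are illustrations only):

```python
import random

message = "a,b,c"

# One extreme: a deliberate, meaningful alteration -- reversing the characters.
print(message[::-1])  # c,b,a

# The other extreme: randomly redistributing the underlying bits.
bits = "".join(format(ord(ch), "07b") for ch in message)
scrambled = list(bits)
random.shuffle(scrambled)
print("".join(scrambled))  # the very same bits, but the meaning is irrecoverable
```

In both cases every bit is still a one or a zero; only the sequence, and hence the meaning, has changed.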
 Throughout recorded history, man has manipulated the sequence of written characters to mask the meaning of his messages. The art of doing so is called cryptography.
 Flavours of cryptography
 Cryptography has largely remained in the realm of state security throughout the ages. As we will see, it is only now emerging from the shadows, to serve a useful role in everyday life. Although cryptography is, and probably always will be, the exclusive preserve of advanced mathematics, some understanding of cryptography is essential in order to appreciate the role it can play in bolstering digital information so that it can play a more significant role in society.
 Until the mid nineteen-seventies, cryptography came in one dominant flavour: symmetrical cryptography. A cryptographic system is said to be symmetrical when the key that is used to encrypt data is also used to decrypt it. The encryption key and the decryption key are both the same.
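The symmetry can be illustrated with a toy XOR cipher (an illustration only, not a secure cipher): one and the same function, applied with one and the same key, both encrypts and decrypts.

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key.
    Applying it a second time with the same key undoes the encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret"
ciphertext = xor_crypt(b"attack at dawn", key)
print(xor_crypt(ciphertext, key))  # b'attack at dawn' -- the same key recovers the text
```

Whoever holds the key can both read and write messages, which is precisely the key-distribution problem that symmetrical cryptography never solved.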
 In the mid nineteen-seventies, researchers at the Massachusetts Institute of Technology(5) invented a second method of cryptography: asymmetrical cryptography. In asymmetrical cryptography, encryption keys come in pairs. Each key of a key-pair is different from the other key in the key-pair, although the keys are mathematically related to each other. The relationship between the keys is such that the following holds true: data encrypted with one key of the pair can be decrypted only with the other key of the pair, and vice versa.
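What one key of a pair encrypts, only the other key of the pair can decrypt. This can be sketched with a deliberately tiny RSA-style key-pair (the numbers below are toy values, hopelessly insecure, chosen only so the arithmetic is visible):

```python
# Toy RSA-style key-pair -- illustration only, far too small for real use.
p, q = 61, 53
n = p * q                          # 3233, the shared modulus
e = 17                             # one key of the pair
d = pow(e, -1, (p - 1) * (q - 1))  # the other, mathematically related key

m = 42  # a message encoded as a number smaller than n

# What one key encrypts, only the other decrypts -- in either direction.
assert pow(pow(m, e, n), d, n) == m
assert pow(pow(m, d, n), e, n) == m
print(d)  # 2753 -- the matching key (in real key-pairs, infeasible to derive from e and n)
```

It is this two-way relationship, with one key published and one kept secret, that makes digital signatures possible.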
 Digital signatures
 The technique of the digital signature lies at the heart of large-scale data authentication.(6) It is a technique with many subtleties and it lends itself, in various guises, to some very interesting authentication possibilities. It is therefore important to understand how a digital signature is made. One way to acquire an understanding of digital signatures is to consider the following simple example.
 Let us say that Alice and Bob wish to exchange e-mail messages over the Internet and that they wish to have a high degree of assurance that the messages they exchange are confidential and authentic. Alice and Bob each have personal computers on which e-mail software and public key encryption software have been installed. They have each generated an encryption key-pair. In each case they have carefully kept secret one key of the key-pair (which for this example we will call their respective private keys), while they have exchanged the other key of the key-pair with each other. The keys which they have exchanged we will call their public keys. Once this public-key exchange is complete, Alice has her key-pair as well as Bob's public key and Bob has his key-pair as well as Alice's public key.
 Alice intends to send a message to Bob: "Please meet me at the market at 4 this afternoon". Her e-mail program calls on the encryption program. She elects to encrypt the message (for privacy reasons) and to append her digital signature to it (for authentication purposes). She has the choice of doing either, or both. The encryption software first takes the message and distills a 'hash' value of it using an encryption algorithm. A hash function is a one-way encryption-based function which calculates a kind of digital fingerprint for any given message. Let's say that the hash value of the message "Please meet me at the market at 4 this afternoon" is "4496". The nature of a hash function is (a) that it operates in one direction only, so that it is impossible to recreate the original message starting from its hash value, and (b) that a change of as little as one bit of the original message will produce a very large change in the message's hash value. It is practically impossible for even slightly different messages to generate identical hash values. The software then encrypts the original message text using Bob's public key. It then encrypts the hash value of the message using Alice's private key. The encryption of the hash value with Alice's own private key is referred to as her digital signature. The encrypted messages may then be sent to Bob over the Internet, or by an equally insecure method.
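The behaviour of the hash function in the example can be observed directly; this sketch uses SHA-256 as a stand-in for the unnamed algorithm behind the toy "4496" value:

```python
import hashlib

msg = "Please meet me at the market at 4 this afternoon"
altered = "Please meet me at the market at 5 this afternoon"

h1 = hashlib.sha256(msg.encode()).hexdigest()
h2 = hashlib.sha256(altered.encode()).hexdigest()

print(h1[:16])   # a short slice of the message's digital fingerprint
print(h2[:16])   # one changed character produces a wholly different value
print(h1 == h2)  # False
```

Note that the digest reveals nothing about the message itself: the function runs in one direction only.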
 During transit over the Internet, the messages can be inspected by curious eyes at will. Alice is very popular and her public key is very widely available. In fact, it has been published in directories which are widely available. Her private key is a deep secret known only to herself. Bob's private key is similarly known only to him. Thus, while anyone in possession of Alice's public key can decrypt the hash value of the message and obtain "4496" (since, having been encrypted using Alice's private key it can be readily decrypted using her public key), they are incapable of making head or tail of the message itself since they have no access to Bob's private key. Nor can they recover the text of the original message by manipulating its hash value.
 Bob receives the messages in due course. Like Alice, Bob's computer has an e-mail program which is integrated with an encryption program similar to Alice's. When Bob attempts to read Alice's message, a number of things happen. First, Bob is prompted for the passphrase of his private key which he supplies. Using Bob's private key, the software then decodes the message and obtains "Please meet me at the market at 4 this afternoon". Because Bob wants to be sure that the message truly comes from Alice (Bob is also very popular and women are always trying to trick him into meeting them at the market by masquerading as Alice), he requests that the software verify Alice's digital signature.
 The encryption software in Bob's computer first takes the message "Please meet me at the market at 4 this afternoon" and distills a 'hash' value of it using the same hashing algorithm as that employed in Alice's encryption software. The message yields a value of "4496". The software then uses Alice's public key and attempts to decrypt the encrypted message hash received from Alice. The decryption succeeds and yields the original hash value "4496". The two hash values are then compared and found to be identical. The software then proclaims on Bob's computer screen a message which reads "Good digital signature from Alice".
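The complete sign-and-verify round trip can be sketched as follows, combining a real hash function with a deliberately tiny, insecure key-pair (all numeric values are toys, not anything Alice would actually use):

```python
import hashlib

# Alice's toy key-pair -- illustration only.
p, q = 61, 53
n = p * q
e_pub, d_priv = 17, pow(17, -1, (p - 1) * (q - 1))

def toy_hash(message: str) -> int:
    """Stand-in for the '4496' hash value: a real SHA-256 digest,
    reduced modulo n so that it fits the toy key size."""
    return int.from_bytes(hashlib.sha256(message.encode()).digest(), "big") % n

message = "Please meet me at the market at 4 this afternoon"

# Alice signs: she encrypts the hash value with her PRIVATE key.
signature = pow(toy_hash(message), d_priv, n)

# Bob verifies: he decrypts the signature with Alice's PUBLIC key and
# compares the result against the hash value he computes himself.
print(pow(signature, e_pub, n) == toy_hash(message))  # True: good signature from Alice

# Verifying the same signature against tampered text will not match.
print(pow(signature, e_pub, n) == toy_hash(message + " sharp"))
```

Anyone may verify the signature, since Alice's public key is widely published; only Alice could have produced it, since only she holds the private key.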
 As a result of the basic principles at play in asymmetrical cryptography, it is possible to conclude in the above example that the message sent by Alice to Bob was authentic and had not been tampered with en route, for the following reasons: first, only Alice's private key could have produced a digital signature which her public key successfully decrypts, so the message must have originated with Alice; and second, because the hash value Bob's software computes from the message he received matches the hash value recovered from Alice's digital signature, the message cannot have been altered, even by a single bit, in transit.
 Public Key Infrastructures
 In order to perform their authentication magic, public key cryptography and digital signatures must rely on the existence of an infrastructure designed to permit public keys to be widely disseminated with a high degree of assurance. Digital signatures work very effectively to authenticate digital records in otherwise insecure environments. In order to work well however on a large-scale basis, it is necessary for all users to know, with a relatively high degree of assurance, the public keys of the persons with whom they wish to exchange authenticated data. Without reliable access to the author's public key, there is simply no way to verify a digital signature.
 Public key infrastructures (or simply "PKI") are the amalgam of software, standards and institutions which, taken together, allow for the dissemination of the encryption software and the dissemination and management of public keys. It is beyond the scope of this paper to explain in detail the functioning of public key infrastructures(7) or to mention all of the companies which offer public key infrastructure related products. The description which follows is merely intended to give a rough idea of the breadth of the implementation of this technology at this time.
 Reduced to its simplest expression, a public key infrastructure comprises the encryption software installed on users' computers, one or more certification authorities which verify the identity of users and issue digital certificates binding each user's identity to his or her public key, and directories in which those certificates are published so that public keys may be retrieved and relied upon with confidence.
 It is important to consider that the software industry and the community of major software users, including government and large corporations, are very much committed to the development and deployment of public key infrastructures. Examples of this commitment on the part of state governments in the United States can be seen on the PKI website operated by the State of Massachusetts.(36) In Canada, the development and deployment of PKI was stated as a key recommendation of the federal information highway task force. The Information Highway Advisory Council, established by Industry Canada to make recommendations in relation to the deployment of the information highway in Canada, recommended to the Minister of Industry that measures be taken to establish public key infrastructures in Canada.(37) The Canadian government is currently implementing a public key infrastructure based on Nortel's Entrust application.(38)
 Other data authentication methods
 Digital signatures are not the only technique by which data can be authenticated. Other technology exists as well. For example, there is the technique of the electronic signature, which is software designed to allow the user to manually sign a digital record using a stylus on a pen-enabled computer screen or on a digitizing tablet. Such systems use biometric techniques to analyze the handwritten signature so as to obtain a measure of its unique attributes. The digital record, the digitized image of the manual signature, the user's identity profile and the handwriting analysis are then bound together using cryptographic techniques so that the user's signature, their identity profile and the handwriting analysis authenticate the document.(39)
 There are also weak authentication systems which, for lack of a better expression, may be referred to as digital paper.(40) The technique in digital paper systems is to replace the binary file containing the information in machine- and human-readable form (i.e. a binary ASCII, desktop publishing or word processing format file) with an image of the document identical to the image created when the document is printed using the application with which it was produced. Digital paper provides a tamper-resistant envelope for digital information in the sense that it would not be a trivial exercise to alter the digital record so as to change the message "a,b,c" to read "c,b,a". Some degree of authentication is thereby provided. Nothing of course prevents the digital record from being completely replaced by a forgery, in the same way that a paper record can be forged. Such systems serve a very useful purpose, particularly where it is important to transmit the exact equivalent in digital form of a paper record, but they do not in and of themselves satisfy an authentication function.
 In the end, all data authentication techniques must come to rest on some form of encryption, whether the encryption is formal encryption as in symmetric or asymmetric encryption techniques, or informal in the sense that any binary file in a proprietary format can be said to be 'encrypted' because it necessitates the recipient having a 'key' in the shape of the software capable of reading the file format.
 At the present time, the most robust, scalable, and effective means for authenticating digital records is that of the digital signature and its related applications.
 A word about digital watermarking
 Digital watermarking describes a series of digital techniques designed to address another problem associated with the malleability and lack of originality which are the hallmarks of digital records.
 Digital watermarking embeds within a digital record a substream of information which tends to identify the origin or authorship of the record. Digital watermarking does not, in and of itself, really address the issue of the integrity or authenticity of a digital record. It does not authenticate the document because it does not create a reliable and verifiable link between the identity of the author of the digital record and the digital record itself, as is the case with digital signatures employed in a public key infrastructure. A digital watermark is so difficult to separate from the content it marks that it is generally not economical to attempt to do so. Nothing, however, precludes forged watermarks from being created. While the digital watermark may therefore provide a powerful tool for the protection of intellectual property in digital records, it is not really suited to authenticating content per se.
 Depending on the technique employed (and there are several techniques in existence), either the entire sequence of bits of a digital record, or a random sampling of strings of its bits, have inserted among them an additional sequence of bits containing the information relating to the origin of the record. The inserted bits are the equivalent of burying a covert message as "noise" in an analog sound recording. The digital watermark conveys its information without being perceived when the resulting work is played or displayed.(41)
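A minimal sketch of the bit-insertion idea hides a few identifying bits in the least-significant bit of pretend audio samples (a toy scheme for illustration, not any production watermarking product):

```python
def embed_watermark(samples, mark_bits):
    """Hide watermark bits in the least-significant bit of each sample --
    a toy version of burying a message as 'noise' in a recording."""
    marked = [(s & ~1) | b for s, b in zip(samples, mark_bits)]
    return marked + samples[len(mark_bits):]

def extract_watermark(samples, length):
    """Read the hidden bits back out of the low-order bit of each sample."""
    return [s & 1 for s in samples[:length]]

audio = [200, 137, 64, 255, 18, 92]   # pretend 8-bit audio samples
mark = [1, 0, 1, 1]                   # the hidden identifying bits

marked = embed_watermark(audio, mark)
print(extract_watermark(marked, len(mark)))            # [1, 0, 1, 1]
print([abs(a - b) for a, b in zip(audio, marked)])     # each sample shifts by at most 1
```

Because each sample changes by at most one unit, the watermark is imperceptible when the samples are played, yet fully recoverable by anyone who knows where to look.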
 By its very nature, digital watermarking only works with digital records which reproduce an image or sounds by a process of sampling,(42) and will not work where the document has to be literally rendered, as is the case with a document which renders machine readable text (like a digital record of text in ASCII, proprietary word processing or HTML format) or with executable files which contain a computer program. It is a technique which can be employed for text but only in conjunction with digital paper techniques since in the case of digital paper, the words in the document are an image or visual sample of the text rather than the text itself.(43)
 Digital signatures may be used either as a means of digital watermarking for text or executable files, or as an adjunct to the other digital techniques used for digital watermarking. Where digital signatures are used, strong authentication of the digital record becomes an option.
1. The purpose of this paper is to explore the nature of digital records and the challenges and opportunities they present when we seek to replace paper-based processes with ones based on digital records. The primary focus of this paper is on the question of authentication. In this context the word "authentication" is used in its normal dictionary sense as follows (Pocket Oxford Dictionary):
"authentic a. (~ally). Trustworthy, entitled to acceptance, (authentic statement); of undisputed origin, not forged etc., (authentic documents, pictures); ~ate v.t. (~able), establish truth or authorship or validity of (statement, document, claim); ~ation, ~ator, authenticity, ns. [F f. L f. Gk authentikos genuine]"
There are other uses of the term, in particular in the law of evidence as regards the production of documents, where more or less precisely defined criteria apply and where a manual signature may play an important role. The term is used here in its broadest common sense to describe the ability of a recipient of information to know, with reasonable certainty, the origin and integrity of a given digital record.
The paper is not intended to propose any particular course of action, merely to supply background information intended to facilitate informed discussion of the authentication issues raised by digital records.
2. The author is a member of the Bar of the province of Quebec and Assistant Corporate Secretary of BCE Inc. and Bell Canada.
3. Marshall McLUHAN, http://www.mcluhanmedia.com/mmclm001.html
4. The term "ASCII" stands for American Standard Code for Information Interchange. The ASCII standard is a table defining, in binary form, 128 standard characters comprising the alphabet, punctuation, the number set and certain control characters such as carriage returns, line-feeds and the like.
5. Ron Rivest, Adi Shamir and Len Adleman. They put the "RSA" in RSA Data Security Inc. See http://www.rsa.com.
6. By large-scale data authentication the author means that the infrastructure necessary to permit it would be ubiquitous, on a national, international and even global scale. In other words, all publishers and authors of legal information would have the choice of digitally signing their published data if they wished to do so, and that all readers would be able to verify the digital signatures related to the data they receive, if they chose to do so.
7. For the reader who wishes to acquire a deeper knowledge of the working of public key infrastructures the author suggests the following materials: Michael FROOMKIN, The essential role of trusted third parties in electronic commerce 75 Oregon L. Rev. 49 (1996) available online at http://www.law.miami.edu/~froomkin/articles/trusted.htm; C. Bradford BIDDLE, Misplaced Priorities: The Utah Digital Signature Act and Liability Allocation in a Public Key Infrastructure, 33 San Diego L. Rev., available in an earlier version at http://www.SoftwareIndustry.org/issues/1digsig.html; David MASSE, Economic Modelling and Risk Management in Public Key Infrastructures, text of a conference given by the author at the RSA Data Security Conference on January 31, 1997 in San Francisco, available online at http://www.chait-amyot.ca/docs/pki.html.
18. Information Week, E-commerce gets real, December 9th, 1996.
28. http://www.verisign.com and in particular http://www.verisign.com/smime/nsemail.html
30. The Bell Sygma OnWatch service is described at http://www.public-key.com/index.html
37. See recommendations 10.12 and following of the report published in September of 1995 entitled Connection, Community, Content - The challenge of the Information Highway, available online at http://strategis.ic.gc.ca/SSG/ih01070e.html. The recommendations relating to the establishment of PKI may be found at http://strategis.ic.gc.ca/SSG/ih01041e.html.
38. See section 8.2 Encryption and Digital Signatures in the report entitled Standardized Electronic Forms Information Interchange: Pilot Project Summary Report prepared for the Electronic Document Standards Working Group (EDSWG) of the Treasury Board Secretariat of the Government of Canada available on the Treasury Board Website by searching the keywords "public key infrastructure" at http://www.info.tbs-sct.gc.ca:80/cgi-bin/searchCGI?language=English
39. See for example the PenOp authentication system at http://www.penop.com/index.htm.
40. For a leading example, consider the Adobe Acrobat system which generates files in 'portable document format' or simply PDF. See http://www.adobe.com/prodindex/acrobat/main.html. See as well the Common Ground application offered by Hummingbird Communications Inc. http://www.hummingbird.com/cg/whitepapers/dpweb.html.
41. See Business Week, Copyright's new digital guardians, available online at http://www.businessweek.com/1996/19/b347474.htm and Byte, Look, it's not there - Digital watermarking is the best way to protect intellectual property from illicit copying, available online at http://www.byte.com/art/9701/sec18/art1.htm.
42. Sampling is the process used to produce a digital rendering of images or sound. In the case of an image, bits are used to map the image which appears on a screen or which is sent to a printer. While an object's image is made up of an infinite number of points of light, a picture of the object in a newspaper is composed of a number of dots (picture elements or pixels) which form the image by being switched on (let's say to black) or off (in this case to white). Each pixel has coordinates within the matrix (screen or display area) and may thus easily be represented by a "1" or "0". Each individual bit contributes very little to the overall result. There is therefore "room" to insert extra bits to carry other information without disturbing the ability to render a likeness of the original image (just as the scanning bar of a fax machine often introduces black streaks on faxes which most often don't interfere with the readability of the message itself). In the case of sound, the sound wave is similarly placed on a graph and the resulting wave is sampled by taking reference points along the curve. Just as an image is made up of an infinite number of points of light, the analog sound wave is made up of an infinite number of points on the wave. Digital sampling cannot record all points on the wave, nor does it need to in order to render an acceptable likeness of the sound. Just as in the case of the image, it is possible to insert extra bits in the transmission which do not interfere with the resulting sound when it is replayed.
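The "room" for extra bits the note describes can be sketched with a simple least-significant-bit scheme, a much cruder cousin of the commercial watermarking techniques discussed above. The function names and sample pixel values below are invented for illustration:

```python
def embed_watermark(pixels, watermark_bits):
    """Hide watermark bits in the least significant bit of each pixel value.

    Flipping the lowest bit changes an 8-bit pixel's brightness by at most
    1 out of 255, which the eye cannot perceive - the "room" described in
    the note above.
    """
    marked = list(pixels)
    for i, bit in enumerate(watermark_bits):
        marked[i] = (marked[i] & ~1) | bit   # clear the low bit, then set it
    return marked

def extract_watermark(pixels, length):
    """Read the hidden bits back out of the low bit of each pixel."""
    return [p & 1 for p in pixels[:length]]

# Hypothetical 8-bit grayscale pixel values and a 4-bit mark:
image = [200, 37, 142, 99, 250, 16]
mark = [1, 0, 1, 1]
stamped = embed_watermark(image, mark)
assert extract_watermark(stamped, len(mark)) == mark
# Each pixel differs from the original by at most 1 brightness step:
assert all(abs(a - b) <= 1 for a, b in zip(image, stamped))
```

Real systems such as Argent spread the mark across the record far more robustly, so that it survives cropping, compression and other transformations; this sketch shows only why sampled media have spare capacity for hidden bits at all.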
43. See for example the more detailed and technical information made available by The DICE Company, developers of the Argent digital watermarking technology, available from their web site at http://www.digital-watermark.com/ArgentFAQ.htm.