2017 Edition

Welcome to the Winter 2017 Sagesse Publication!

1. Introduction – Sagesse Winter 2017 (384 KB)

by Uta Fox, CRM, ARMA Canada, Director of Canadian Content

2. Memory as a Records Management System (471 KB)

by Sandra Dunkin, MLIS, CRM, IGP and Cheri Rauser, MLIS

3. RM in Canadian Universities – The Present & Future (329 KB)

by Shan Jin, MLIS, CRM, CIP

4. Electronic Recordkeeping – From Promise to Fulfillment (370 KB)

by Bruce Miller, IGP, MBA

5. D’archivage électronique – De la promesse à l’accomplissement (439 KB)

par Bruce Miller, IGP, MBA

6. From Chaos to Order – A Case Study In Restructuring a Shared Drive (957 KB)

by Anne Rathbone, CRM and Kris Boutilier

Meet the Authors

Sandra Dunkin, MLIS, IGP, CRM, is the Records & Information Management Coordinator for the First Nations Summit Society. Sandra is also currently the Program Director for the ARMA Canada Conference, Chair of ARMA Vancouver’s First Nations RIM Symposium Committee, and a member of ARMA International’s Core Competencies Update Group.

Currently working as an academic librarian in online distance education, Cheri Rauser, MLIS, enjoyed exploring the current state of cognitive informatics and neuroscience while collaborating on this article. Besides working in academic librarianship, Cheri has been employed as a university lecturer, museum cataloguer and moving image archivist. Her other research interests include the role of the library in accreditation and web-based versus paper indexing.

Shan Jin, MLIS, CRM, CIP, is a Records Analyst/Archivist at Queen’s University Archives. She earned a Master of Library and Information Studies degree from Dalhousie University, and is a Certified Records Manager and a Certified Information Professional. She has contributed to several ARMA technical reports. Shan can be contacted at jins@queensu.ca.

Bruce Miller, IGP, MBA, is President of RIMtech, a vendor-neutral records technology consulting firm. He is an author, an educator, and the inventor of modern electronic recordkeeping software. The author of “Managing Records in Microsoft SharePoint,” he specializes in the deployment of Electronic Document and Records Management Systems.

With a twenty-two-year career in local government IT, Kris Boutilier has overseen numerous reinventions of technology, from DOS 3.2 to Windows 10, 300-baud dial-up to 100 Mbit broadband, and the IBM Selectric to Office 365. Transitioning from physical to electronic records management has proved the most challenging undertaking yet. Contact Kris at Kris.Boutilier@scrd.ca.

Anne Rathbone, CRM, has 20 years of RIM experience, all with local governments. She was one of the leaders on the shared drive project, provides all staff training on the new shared drive, and is responsible for maintaining the integrity of the new framework. She echoes Kris’ sentiments about e-records. Contact Anne at Anne.Rathbone@scrd.ca.

Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada Publication, Winter 2017, Volume II, Issue I

 

Introduction

Welcome to our second issue of Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada publication!

In March 2016, ARMA Canada launched the first issue of this publication under the working title Canadian RIM, an ARMA Canada Publication. At the same time, we announced a contest inviting the ARMA Canada membership to suggest a title that reflected a focus on Canadian records and information management and information governance. Deidre Brocklehurst, from Surrey, British Columbia, suggested Sagesse: Journal of Canadian Records and Information Management, which was the top choice of the Canadian Content Committee, now called Sagesse’s Editorial Review committee (the committee). We congratulate Deidre on such an appropriate title.

Sagesse (pronounced “sa-jess”) is a French word meaning wisdom, good sense, foresight and sagacity, which is most appropriate for the mandate of ARMA Canada’s publication. It embodies Canada’s unique heritage while instilling knowledge, wisdom and common sense.

Sagesse’s Issue I in 2017 features the following articles:

 

  • “Memory as a Records Management System,” by Sandra Dunkin and Cheri Rauser, highlights how our brains process, organize and retrieve information through memory and recall patterns, and how those patterns can inform a records management system. This is indeed a unique approach to records management.
  • Shan Jin presents an interesting and thorough discussion of records management in Canadian universities in her article entitled “Records Management in Canadian Universities: the Present and the Future.” She provides a comprehensive view of current records management practices in Canadian universities.
  • In “From Promise to Fulfillment: The 30-Year Journey of Electronic Recordkeeping Technology,” Bruce Miller shares his intriguing personal journey in the development of electronic recordkeeping software technology. This article is also translated into French.
  • And Anne Rathbone and Kris Boutilier provide a case study on the Sunshine Coast Regional District in British Columbia and its ambitious undertaking of restructuring a shared drive used by all employees, in their article “From Chaos to Order – A Case Study in Restructuring a Shared Drive.”

 

We’d also like you to be aware of the disclaimer at the end of this Introduction, which notes that the opinions expressed by the authors are their own and not those of ARMA Canada or the committee. If, after reading any of the papers, you find you are in agreement or have other thoughts about the content, we would certainly like to hear them and urge you to share them with us. We’ll try to publish your reflections in our next issue. And if you have any recommendations about our publication, please share these as well. Opinions and comments should be forwarded to: armacanadacancondirector@gmail.com.

What goes into putting this type of publication together? First of all, we need you and your RIM-IG experiences! Then, Sagesse’s volunteer Editorial Review committee is on hand to assist you. We have received some amazingly unique articles for publication and we applaud our authors for their dedication to our Canadian industry.

It takes time to prepare each edition, from the point when we approach authors to when we actually are able to publish an article. Information about the types of articles we are interested in and the process through which the articles go is available on ARMA Canada’s website – www.armacanada.org – see Sagesse.

One other item I would like to draw to your attention is our shameless promotion of a session at ARMA Canada’s upcoming conference in Toronto, ON, in 2017. Two of Sagesse’s Editorial Review committee members, Christine Ardern and John Bolton, will deliver a presentation on writing for Sagesse – something we encourage each of you to pursue.

 

Enjoy!

ARMA Canada’s Sagesse’s Editorial Review Committee:

Christine Ardern, CRM, FAI
John Bolton

Alexandra (Sandie) Bradley, CRM, FAI

Uta Fox, CRM, Director of Canadian Content
Stuart Rennie

 

DISCLAIMER

The contents of material published on the ARMA Canada website are for general information purposes only and are not intended to provide legal advice or opinion of any kind. The contents of this publication should not be relied upon. The contents of this publication should not be seen as a substitute for obtaining competent legal counsel or advice or other professional advice. If legal advice or counsel or other professional advice is required, the services of a competent professional person should be sought.

While ARMA Canada has made reasonable efforts to ensure that the contents of this publication are accurate, ARMA Canada does not warrant or guarantee the accuracy, currency or completeness of the contents of this publication. Opinions of authors of material published on the ARMA Canada website are not an endorsement by ARMA Canada or ARMA International and do not necessarily reflect the opinion or policy of ARMA Canada or ARMA International.

ARMA Canada expressly disclaims all representations, warranties, conditions and endorsements. In no event shall ARMA Canada, its directors, agents, consultants or employees be liable for any loss, damages or costs whatsoever, including (without limiting the generality of the foregoing) any direct, indirect, punitive, special, exemplary or consequential damages arising from, or in connection to, any use of any of the contents of this publication.

Material published on the ARMA Canada website may contain links to other websites. These links are not under the control of ARMA Canada and are provided solely for the convenience of users. ARMA Canada assumes no responsibility or guarantee for the accuracy or legality of material published on these other websites. ARMA Canada does not endorse these other websites or the material published there.

Authors’ Foreword to Memory as a Records Management System

 

This paper was originally written in 2000 as a required assignment for the University of British Columbia’s (UBC) School of Library, Archival and Information Studies’ (SLAIS) LIBR 516: Records Management course, taught by Alexandra (Sandie) Bradley. The paper was subsequently published in the ARMA Vancouver Chapter newsletter, VanARMA, Volume 35, Issue 7, February 2004 (see newsletter introduction below).

The authors thoroughly enjoyed the collaborative and creative process of first drafting this paper as students and were thrilled to be asked to revisit the topic for Sagesse. The original paper now seems charmingly naïve in parts, and certainly dated, as we reference ‘palm pilots’ and other outdated technology concepts and limitations of the year 2000. And yet much of the substance of our original thesis remains compelling after the intervening 16 years of development.

In the years since the paper was first written and published, the authors have been influenced by the many changes occurring in the fields of library science and records management, neuroscience, and technology research and development; by the documented challenges to cultural biases around collective memory; and by the growth in our own understanding stemming from personal experience and increased professional knowledge as our careers have progressed. In this iteration the authors have endeavoured to update our understanding of these and other topics through the exploration of new research into neuroscience and jurisprudence, and to include an expanded discussion of collective memory and oral traditions as a valid means of historical and cultural record keeping, with a particular focus on Canadian Aboriginal oral culture arising from one of the authors’ professional employment experience.

The original paper remains almost in its entirety (quotes referencing palm pilots and all); however, it has been gently restructured to fit within the context of our new research and expanded thesis. The original introduction to the paper from the VanARMA publication has also been retained below, as it is no longer available in print or online. It has been a joy to revisit this topic and to review new research that expands our own understanding of this complex subject.

Sandra Dunkin & Cheri Rauser, Vancouver, BC, September 2016

VanARMA Introduction to Memory as a Records Management System, February 2004:

By: Sandra Dunkin, VanARMA Newsletter Editor

This month features an article on how the human brain processes, organizes, and retrieves information through memory and recall patterns. It outlines many of the basic memory functions the human brain is capable of and compares them, more than favourably, with modern technological devices designed to recreate those very processes artificially.

Many of us spend countless hours in front of our computers, structuring databases, creating classification and retrieval systems for records of all types on a variety of media. What if we could improve upon the document management systems we use daily by having them mimic our natural cognitive processes?

As technology speeds ahead with new, bigger and faster means of storage and retrieval of records, we have a unique opportunity, Muse-like, to inspire software designers, hardware engineers and other assorted computer geeks on how to create better utilities to manage these masses of information. We already have voice recognition capability, but what about making access points to data storage more flexible, more “human”?

The human brain has a long history of storage and retrieval processes with a plethora of access points that can be as humorous and surprising as they are effective. Databases and other software applications are the tools of our profession. Perhaps it is time to consider how we really want/need them to work for us.

Mnemosyne, one of the Titans of Greek mythology, Goddess of Memory and, by Zeus, mother of the Muses. According to Mary Carruthers (1996), memory was the most noble aspect of ancient and medieval rhetoric. Oil painting by Dante Gabriel Rossetti, 1881. Collection of the Delaware Art Museum, Wilmington. Gift of Samuel and Mary R. Bancroft.

 

Memory as a Records Management System

by Sandra Dunkin, MLIS, CRM, IGP & Cheri Rauser, MLIS

 

Introduction

In the last 15-20 years there has been an exponential increase in the use of mobile technology, even as some twenty-first-century Luddites bemoan our embrace of it. Those of us who appreciate its conveniences and rely on it to do our jobs want to understand how we can develop technology that draws on the human capacity to store and retrieve information, using the human brain as a template for future records management systems.

Rather than sound a warning bell of dire consequences if we don’t halt our engagement with mobile technology, the authors intend in this work to highlight and validate the intrinsic value of human-brain-based oral memory creation and its application in records and information management (RIM). While acknowledging the inherent danger of information overload, the authors will explore the potential of orality in records management endeavours: past, present and future. Further, to highlight the value of, and the connections between, what humans have always done and are working to improve while still utilizing modern technology, the authors will explore some of the methods by which oral-traditional cultures encode records in memory and discuss some of the ancient functions of oral records managers.

 

The Brain and Records Management

An early response to the increase in mobile technology came from Kate Cambor (1999), who suggested that people were becoming overly reliant on an “accumulating external memory network” of aides-mémoire in the form of computers, palm pilots and other storage and retrieval devices (2). Cambor maintained that increasing dependence on such externalised media had been to the detriment of training our neurological filing cabinets (aka memory) to perform their autonomic tasks of classifying and retrieving data. Jim Connelly, CRM (1995) had made similar warnings four years earlier, suggesting that the brain’s capacity for retrieving information is a “common” records management tool that is being largely forgotten and ignored (35).

The authors maintain that records managers and information professionals are well positioned to assist us in managing the information overload that can result from access to far greater amounts of information than ever before. The professional skills and techniques of this group can be harnessed to help mitigate that potential overload by accessing the human brain’s innate capacity to make sense of what at first appears nonsensical and disconnected. The authors will show how the technology of the human brain, and the methods humans have employed to develop oral systems to store information (corporate memory), can provide clues for modern records managers to design systems that enhance, rather than work against, the human brain’s capacity for logical storage and retrieval of information (Wang 2003).

Records and information management (RIM) professionals could, in the near future, plan information systems that take into account the logical mental cues that enhance people’s memories and therefore their ability to retrieve information from both mental storage facilities and computerised storage systems. According to Jim Connelly (1995), information and records managers need to recognise that “memory is . . . the most common information retrieval software known to man [sic]” (35). Understanding how the human brain stores information could enhance our ability to anticipate how the brain strategically files or searches for data within a central records management system, a library database, an online catalogue, an encyclopaedia or the internet.

But in order to employ the logic of human-filed memory we must first understand how memory is encoded in the brain and the roles that oral memory systems have traditionally played in how individuals, and therefore societies, remember. To that end, the authors will review the concept of oral memory and oral traditions, as well as the science of the brain’s memory functionality. This context is essential in formulating theories for the improvement of modern records management systems – storage, maintenance, retrieval and disposition. The tenacity and perseverance of oral records should inform the paradigm by which we approach modern RIM practice, especially in this age of Big Data and the proliferation of stored records.

 

Oral Traditions and the Written Record

Circa 2,500 years ago, in the Phaedrus, Plato asserted that Socrates saw the development of alphabets as a crutch that limited the capacity and usefulness of the brain as the central storage system for human knowledge: “Writing, far from assisting memory, implanted forgetfulness into our souls” (Plato 370 BC, 274c-276e, translation by Fowler 1925, and Kelber 1995, 414). Using the written word during this time, in the context of aeons of aurally transmitted records management culture, was rather like everyone keeping a personal copy in today’s automated office environment. Plato further asserted that Socrates believed written words were antisocial because they segregated themselves from living discourse, suggesting that, much like a painting, “writing maintains a solemn silence”; written words stare at readers, telling them “just the same thing forever” (Plato 370 BC, 274c-276e, translation by Fowler 1925, and Kelber 1995, 414).

In the early Renaissance, the rise of Gutenberg’s printing press made the written record more accessible and led to the broad dissemination of various religious and political ideologies and propaganda that threatened the power base of the aristocratic rulers of Europe. Printing was initially viewed with suspicion and contempt as a means of disseminating unapproved and non-authoritative information, and in the Renaissance period it remained largely exclusive to the literate elite of society. The eventual democratization and broad distribution of written information over time has been largely beneficial to society, while at the same time diminishing the experiential aspect of dialogue and comprehension and contributing to the decline of oral memory records and oral history as true and credible accounts. The advent of writing may therefore be credited with degrading the value of human memory in the management of authentic historical records.

So, just how credible and reliable an authenticator is the written record? The written record is a subjective snapshot caught in time; it is static, and it serves to externalise individual and collective memory. If only one person’s written account of an event survives, that person’s perception becomes the permanent record, complete with that individual’s subjective biases and interpretation.

It must also be noted that the written record is often ephemeral in nature, subject to all varieties of destruction and disaster (fire, flood, political ideology, and disintegration of the materials on which it is recorded). And once destroyed, it is lost forever. Survival of early written records is, therefore, inconsistent and often a matter of chance.

In contrast, the oral record is a living entity, and as such the collective oral memory is more reliable by right of common ownership within the entire community or culture. It is in essence collectively ‘authenticated’ and preserved. The oral tradition is derived from the community’s sense of what happened and what is important to preserve. “Memory, not textuality, was the centralizing authority” in cultures based on oral tradition (Kelber 1995, 417). The oral or memory record is passed on as a living entity that changes with new understanding and belief about the event, thereby reflecting the community’s, rather than the individual’s, belief about the truth of the record. Rather than being subjective or revisionist, oral records reflect a composite of understanding that is enriched with time and interpretation.

Updating the information contained in oral records is simply a matter of updating your memory or belief about a certain event or idea. According to Ginette Paris (1990),

“The memory at work in oral cultures allows for modification and adjustment, sometimes reversing the meaning of an event. It’s an active memory, which breaks into consciousness through archetypes, dreams and myths, fantasies, symbols and artistic work. It selects and organizes the past, putting into context what is recollected” (121).

Once something is committed to writing, it often becomes the ‘official’ version of the event and the permanent record. The problem with this process is that the written record is by nature static and inflexible. It may be superseded by another version, whether the original remains intact or is physically destroyed. So, rather than seeing the rewriting of history in a contemporary context as dangerous revisionism, we can see it as an attempt to recapture the experience of living oral history. The written record can be, and has been, used as propaganda that may seriously alter our perception of past events, especially if only one subjective version is maintained.

For example, many of us view the words attributed to Elizabeth I in the speech at Tilbury of 1588 as an accurate and contemporary record; however, the only surviving written account exists in a letter of Leonel Sharp in 1624. The existence of such a record, 36 years removed, is concrete evidence of oral tradition at work and accepted into the corpus of so-called authentic written records. The irony of this example is that a culture that colonized and dismissed oral tradition has itself relied on orality to lay claim to instances of its own history and culture. Additional cross-cultural examples are found in complex religious belief systems, where the devout accept as a matter of faith that accounts recorded from oral tradition long after the events are true renderings of them. In certain contexts, the disdain for oral traditions can be equated with cultural racism and the desire to dominate ‘other’ cultures, providing justification for egregious exercises of power.

In contrast to the ancient world of Homer and other oral-tradition records managers – the Greek aoidos, the Anglo-Saxon scop, the Irish poet-ollam, the Italian cantastorie, the French jongleur, the English “singer of tales” – the modern world faced by information and records managers:

“is complicated and we are inundated with information as never before. So instead of straining our own frail memories, we arm ourselves with an array of elaborate aides-memoire; rolodexes, filofaxes, palm pilots and of course computers. Such aids have existed in one form or another throughout written culture, . . . their growing use is evidence that the locus of memory itself has left our individual, biological memories and is now part of an “accumulating external memory network” (Cambor 1999, 2).

Over the centuries, writing has alienated a large segment of the world’s population. Indeed, until the 20th century the majority of the world’s population was illiterate: their learning and knowledge were based on oral traditions. It is a significant cultural loss that, cross-culturally and with notable exceptions, we have lost the capacity to exercise our brains in more than the basic autonomic processes necessary to encode memory. Today, our capacity to remember and retrieve volumes of information is so diminished that we now seek artificial memory enhancement – evidenced in the growing use of ‘natural’ pharmaceuticals such as Ginkgo biloba and of brain training (Lumosity, for example).

Because they operated in an oral tradition, ancient oral rhetoricians (poets, bards, statesmen etc.) had to train their memories by using devices such as alliteration, rhythm, rhyme, stock epithets and synonyms. Irish poet-ollams (several levels of expertise above a bard) spent at least 12 years of their lives in a poet’s apprenticeship, training and memorising the volumes of tales necessary to their trade (MacManus 1967, 179-180). Genealogical inventories and epic histories were two of the methods employed by oral record-keepers and rhetoricians to classify, store and retrieve information vital to their culture and to their professions, and practitioners were highly regarded and respected within their cultural base.

Elaborate genealogical charts, such as those invented by the poet-ollams of Ireland before alphabetic literacy, conveyed the familial and cultural history of a people through lineage, as described in the epic Táin Bó Cúailnge (The Cattle Raid of Cooley). Futuristic societies such as the Klingon Empire, invented by science fiction writer Gene Roddenberry, are based on the Anglo-Saxon culture that values lineage, heritage and honour above all else. And the Judeo-Christian Bible is replete with genealogical inventories covering hundreds of years of corporate memory. All of these inventories were oral in origin, lasting for hundreds if not thousands of years in that form before being written down.

Cultural history is therefore corporate memory, and it is in danger of being lost to our over-reliance on media outside ourselves. Just as oral tradition was largely replaced with written records that may or may not be credible, external technology designed to remember the tasks on our to-do lists, or to record long-term memory, is replacing even the simplest brain-based records management tasks.

Children raised in cultures built on oral tradition access their cultural heritage through oral records stored in the memory of every member of that culture. Children raised in alphabetic cultures learn that the written word – or, in our own time, the televised image – is the path to self-knowledge and cultural comprehension, with televised entertainment serving as the official record-keeper of our culture. And if you don’t have time, you can simply record the information for later, contributing to a possible erosion of our innate ability to use our brains to record information and to retrieve it when needed at a later date.

Oral tradition and history, activated and recorded by the human brain, preserve the corporate memory of families, societies and entire civilisations. Individuals brought up in an oral tradition are trained to process, store and retrieve large volumes of information by activating the enormous capacity of the human brain to organise, store and retrieve information in the form of memories. In oral-traditional cultures, the brain’s capacity for memory enhancement, storage and retrieval has been used as the means of training each successive generation to remember their collective history and to preserve that culture’s vital records. Hence, archaeologists were able to find the city of Troy from the account in Homer’s Iliad, a collection of oral-traditional stories spanning 8-10 centuries of Aegean cultural history – attesting to the veracity of oral record keeping in the absence of written records.

 

The Canadian Oral Tradition Context

The past disdain for oral veracity in record-keeping is notable in the history of the Canadian federal and provincial justice systems with regard to Aboriginal oral history and oral traditions. However, there has been a subtle shift in the perception of collective memory and oral traditions in recent Canadian jurisprudence. This change reflects a reversion to previous generations’ respect for the orality of information and intellectual discourse. The emphasis on the written record as the primary authoritative record for corroboration of events and transactions is being challenged, with a shift away from the written authority on which western culture has based its educational, judicial and cultural institutions and practices.

The landscape is definitely changing, with such landmark decisions as Sparrow, Guerin, Delgamuukw and Tsilhqot’in, in which recognition and respect for collective memory have prevailed over the judicial requirement for documentary evidence in reference to the pre-literate era of Aboriginal societies. Justice Vickers stated in his decision that:

Courts that have favoured written modes of transmission over oral accounts have been criticized for taking an ethnocentric view of the evidence. Certainly the early decisions in this area did little to foster Aboriginal litigants’ trust in the court’s ability to view the evidence from an Aboriginal perspective (Tsilhqot’in Nation v. British Columbia, 2007 BCSC 1700, para. 132).

And later,

Rejecting oral tradition evidence because of an absence of corroboration from outside sources would offend the directions of the Supreme Court of Canada. Trial Judges are not to impose impossible burdens on Aboriginal claimants. The goal of reconciliation can only be achieved if oral tradition evidence is placed on an equal footing with historical documents (ibid., para. 152).

This recent development in Canadian jurisprudence also supports respect for Aboriginal oral history and oral traditions, given that the majority of early post-contact western society was itself largely illiterate. John Ralston Saul (2008) suggests: “We all understand that in the eighteenth and nineteenth centuries most Aboriginals were illiterate; they could not read and write in European languages. But then neither could most francophone and anglophone Canadians. … our voting citizens were largely illiterate. Our democratic culture was therefore oral” (126).

In the case of Aboriginal record keeping, “ground-truthing became difficult if not impossible to accomplish because there may not have been Aboriginal individuals able to communicate with the authors of the contemporary record in either English or French” (Interview with Howard E. Grant, Executive Director, First Nations Summit Society, September 2016). According to Mr. Grant, context has also been a major impediment to understanding. For instance, the questions ‘do you live here?’ and ‘are you from here?’ often went unclarified as to whether they referred to the local (house) or broad (region/neighbourhood) context as understood in Aboriginal culture. Mr. Grant referenced as an example the case of BCCA 487, Docket CA0727336, regarding the Kitsilano reserve lands appropriated by Canada for the development of CP Rail services in Vancouver. There are countless known cases of land appropriation that may be linked to cultural differences in defining what constitutes occupying and/or owning specific parcels of land. Add to that cultural perceptions about ownership: an insistence that only historically recent paper records could denote ownership, whereas cultural memory, however lengthy, was not legitimate proof of either occupancy or ownership.

Another major impediment to understanding is the obvious loss of meaning in translation from western languages to Aboriginal ones and vice versa, and the subsequent misinterpretation of context within the translation between different cultural norms. The modern practice of anthropology, used in many legal proceedings through expert testimony, is in many cases flawed due to its inherently western bias, usually requiring some form of independent corroboration of the oral tradition evidence. Essentially, the corroboration of generations of oral record-keepers acting collectively and collaboratively was not perceived as equal to the written record.

Transfer of knowledge in Pacific Coast Aboriginal cultures was once achieved through observation and social interaction within the home and the community, wherein the next generations would ‘absorb’ and learn the cultural heritage and complex government systems of their individual tribe (Interview with Howard E. Grant, October 2016). Further, Aboriginal oral traditions do not allow for ‘shortcuts’ when their content is shared with subsequent generations: as a result, when Potlatches are held, oral traditions are strictly maintained without deviation (ibid.).

With the advent of western cultures in North America, the ‘Si-yém’ (a Coast Salish term meaning “respected one(s)”, denoting the wisdom, knowledge and experience of the individual(s) so named) understood that change was inevitable, and they recognised that the younger population must become educated and acquire the tools and training necessary to replace oral record keeping. This was the beginning of a transition away from dependence on oral records in favour of written ones; however, it is still understood that careful recording is essential to protect their culture and history from misinterpretation by non-Aboriginal observers (ibid.).

At present, most Aboriginal communities are not necessarily seeking to re-establish an oral culture; rather, they are striving to maintain their collective memory as it may be required or useful in litigation, as well as maintaining their unique cultural and historical context. They are also seeking validation of their oral traditions insofar as they constitute the pre-contact, pre-literate record of their culture; there really isn’t a past or present divide – the oral record exists in a continuum.

Reliance on the written record in legal matters is also relatively recent in the long history of civilization, being enshrined in evidence legislation as recently as the late 19th century (Canada Evidence Act 1893). In judicial systems, the slavish reliance on written records as corroboration often neglects the fact that written records can misrepresent or lie on matters of historical fact to benefit one party over another. Certainly in Canada the necessity for written documentation is at odds with historical practice, given the multicultural nature of the emerging pre-twentieth-century Canadian populace and the need to “relate to power mainly through the oral” (Saul 2008, 127).

 

Memory Systems: Storage

If the “basic characteristic of the human brain is information processing” (Wang 2003), then those ancient (e.g. Celtic and Homeric Greek) societies and modern oral-tradition-based cultures, such as Canadian First Nations, developed sophisticated oral recordkeeping systems by building on the human brain’s natural capacity. They were not anomalies inventing for the sake of necessity, and they were certainly not primitive precursors to the written and electronic record-keeping norms of modern culture. Cultural practices that enhance memory storage, such as storytelling, singing and mnemonics, encouraged the development of the human brain to act as the receptacle of both individual and corporate cultural memory.

Current brain research informs our understanding of just how successful the human brain is as a storage and retrieval tool. Some studies in cognitive informatics suggest that the human brain is the most viable model for future-generation computer and information retrieval systems; these studies reject the brain-as-container metaphor, positing instead that memory is stored and retrieved in a relational manner (Carr 2010; Wang 2003). It seems that there is a “tremendous quantitative gap between . . . natural and machine intelligence”, with the gap favouring human brain capacities (Wang 2003). As a system, the human brain already discards, stores and retrieves information in a manner that is “more powerful, flexible and efficient than any computer system” (Wang 2003). This is consistent with oral record keeping, which allowed for the revision of the story based on new information concerning the corporate cultural memory. This was considered best practice in oral record-keeping.

 

 

You have to begin to lose your memory, if only in bits and pieces, to realize that memory is what makes our lives. Life without a memory is no life at all, just as an intelligence without the possibility of expression is not really an intelligence. Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.
~Luis Buñuel

 

 

Simply put, the human brain’s neural pathways encode memories. Every time the brain gets new information it compares it to old information and forms new connections (Arenofsky 2001; Wang 2003). If our brains were really just containers, all the information we start acquiring as babies would overflow a rapidly depleting capacity and we would have no more room. The human brain would be a storage facility with no structure or capacity for retrieval. We would all be serious hoarders with limited ability to make sense or use of the memories we had acquired and stored, and our users would be tripping over boxes in the hallway filled with content that had no discernible relationship. Instead, we are able to use our brains to organize in a relational fashion and thereby make accessible to us the content we need to make sense of our world and our lives. Just like any good data management system should.
But how does the human brain do what it does so well? How do our brains function and allow the development of such sophisticated oral recordkeeping? What part of the brain is responsible for memory formation and why are memories so important to human creative capacity and technological endeavours? What are the implications for future systems development?

Memory formation and storage is a multi-step, layered activity that takes place in the deeper structures of the brain: most importantly the hippocampus, which saves short-term episodic memories and prepares them for long-term storage, and the neocortex, the long-term storage facility (Hsieh 2012). While there are several memory systems, each serving a different purpose (Bendall 2003), two structures coordinate multiple activities between the long-term and short-term memory systems. Within the frontal cortex is the short-term or working memory system, storing new information while keeping us actively conscious of what is being learned and saved. Within this short-term system are the phonological loop (e.g. silently talking to oneself or learning new words) and the visuospatial sketchpad, which makes it possible to manipulate images in our minds. Finally, the central executive system keeps us aware of those short-term memories and coordinates both the sketchpad and the loop. The neocortex is, in evolutionary terms, a relatively recent addition to our genetics (Rakic 2009). “If any organ of our [human] body should be [seen as] substantially different from any other species, it is the cerebral neocortex, the center of extraordinary human cognitive abilities” (Rakic 2009, 724). It is the crux of our creativity and the uniquely human biological innovation that allows us to develop long-term memory storage systems that rival modern electronic records management systems.

We know from studying the development of human infants and children that experience is crucial to the formation of memory. If you hear something just once, the neurons often do not release enough chemicals to make a lasting impression on the formation of neural pathways or on memory retention. But if you associate, for example, a telephone number with a visual image such as a person or a place, or the warm chest of your parent with feeling safe and content, then there is more neuronal activity. The more neuronal activity there is, and the more experiences you have to stimulate that activity, the more likely it is that deep memories will be stored in the long-term memory systems of the neocortex (Hermann 1993 and Bendall 2003). Episodic memory (experiences) that may be recalled and replayed in the mind is a crucial step in the transfer of recent memory into long-term memory storage.

The hippocampus has been heavily studied concerning its importance to the formation of long-term memory. “A sea-horse-shaped region tucked deep in the folds of the temporal lobe above the ear” (Carmichael 2004, 50), the hippocampus stores recent episodic memories. Those memories are then rehearsed in our minds and eventually stored as long-term memory (probably during sleep) within the neocortex (the outer layer of the brain). It is repetition and recollection that stimulate the hippocampus to start the process of creating long-term memories that can later be accessed. Children who request the same bedtime stories about the events of their day, over and over, are in effect committing to memory a synopsis of the day’s activities, in a process of encoding their family history and committing to memory the deeds of the collective. Watching television or videos is generally a non-experiential activity; the accompanying neuronal activity is therefore much less than that of reading a book, playing in the sandbox, doing a puzzle or telling stories out loud. The type of activity will determine the degree of information retention in the hippocampus, further initiating the transfer of that experience into memory and determining what is eventually available to the person as accessible knowledge in the form of memories.
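By way of illustration only, this rehearse-then-consolidate pattern can be expressed as a toy data structure. The short Python sketch below is not a neuroscientific model – the class, names and rehearsal threshold are our own inventions – but it captures the flow described above: episodes sit in a short-term buffer, and only those rehearsed often enough are promoted to the long-term store during a ‘sleep’ pass.

    # A minimal, illustrative sketch of rehearse-then-consolidate memory.
    # All names and the rehearsal threshold are invented for illustration.
    class TwoStageMemory:
        def __init__(self, rehearsals_needed=3):
            self.short_term = {}    # episode -> rehearsal count (the "hippocampus")
            self.long_term = set()  # consolidated episodes (the "neocortex")
            self.rehearsals_needed = rehearsals_needed

        def experience(self, episode):
            """Each exposure to an episode counts as one rehearsal."""
            self.short_term[episode] = self.short_term.get(episode, 0) + 1

        def sleep(self):
            """Consolidation pass: well-rehearsed episodes are promoted to
            long-term storage; everything else fades from the buffer."""
            for episode, count in self.short_term.items():
                if count >= self.rehearsals_needed:
                    self.long_term.add(episode)
            self.short_term.clear()

    memory = TwoStageMemory()
    for _ in range(3):              # the same bedtime story, three nights running
        memory.experience("the day we built the sandcastle")
    memory.experience("a licence plate glimpsed once")
    memory.sleep()
    print(memory.long_term)         # only the rehearsed story survives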

 

Your memory is a monster; you forget—it doesn’t. It simply files things away. It keeps things for you, or hides things from you—and summons them to your recall with a will of its own. You think you have a memory; but it has you!
~John Irving, A Prayer for Owen Meany, 1989

 

It is now understood that babies are born without the significant neural pathway development associated with the adult brain. The information and experience children are exposed to will create the neural pathways, or synaptic connections, in their brains – or fail to create them, in the case of those who are neglected or abused as youngsters. Since children are not born literate in the alphabetic sense, they spend these crucial early years as learners in a primarily aurally receptive environment, completely dependent on other people to define and determine the nature of their experiences and therefore the type of memory that will be stored in the neocortex. Children’s preferred method of exploring the world and learning is through play, not through rote learning or blanket memorization. Neural pathway development, and the later ability to store and retrieve information in the adult brain, depends on children being exposed to a wide variety of repetitive and fun activities that stimulate neuronal activity in the brain and cause the creation of neural pathways.

This is not to say that only children have the capacity to form significant new pathways in their neural networks, or that later-in-life memory formation and big learning are outside the capacity of the adult human brain. In the past it was theorized, and held as true, that once you were past the baby stage the brain became fixed, with little capacity to learn new things or heal from trauma. Scientists now know that the brain is a “work-in-progress” (Arenofsky 2001), with a healthy brain exhibiting the capacity to learn, change, grow and create new pathways until our end date. The human brain is essentially scalable. Like a well-built records management system, the brain can accept new information and accommodate new systems, incorporating and working with them in ways that were not originally anticipated.

 

Memory Systems: Retrieval

The human brain makes connections to what it already knows and decides what to keep. No multiple versions of the same memory file clutter up our hard drives and get us into trouble when we send our boss the wrong version. But we know that it is easier to forget information that is new, different or that we don’t care about. We rank memories according to their relevance to us, and that relevance is determined by how much we care. People can put to memory large amounts of information to do well on an exam and then forget it all in a short period of time or over the summer break. Even those with so-called eidetic or photographic memory must care about or find relevant the information put to memory, or they will not be able to retain it. If one had true eidetic memory, one would not be able to prioritize or find meaning in the memories acquired; it would be a huge data dump with no retrieval system to mediate between the brain and the information (Schmickle 2010). We are instead the inheritors of a very sophisticated “command center” the size of a grapefruit, exhibiting extraordinary powers of storage, retrieval, relevance ranking and forgetting (aka culling, pruning, throwing away) (Arenofsky 2001).

How different is the human brain’s memory retrieval system from a computer’s? Is human memory similar to the RAM in a personal computer (Freed 1997, 1)? It is not known how much maintenance the systems that store the most stable long-term memories require, but it is unlikely that human memory systems need maintenance in the same way as computer RAM, which basically involves the maintenance of current through a circuit. “There is no bio-electric [sic] system running current through our brains”; rather, the “electric nature of the neurosystem comes from interactions between individual neurons and the task of maintaining this electric system requires only keeping the cells alive and keeping them connected to their neighbouring cells” (Freed 1997). In effect, we maintain our brain’s records management system by exercising our capacity for storing and retrieving memories: by experiencing life through all of our senses and exercising our natural capacity to manage information. Of course human memory can be disrupted by accident or disease, but the innate capacity of the human brain to store memory is not in question here. Since the hippocampus is critical for normal memory function (Hsieh 2012), we now know that damage or disease in that deep-brain memory-making place compromises our capacity to create retrievable memories in most of the memory-making systems of the brain (Bendall 2003).

Human memory creation and long-term storage is admittedly a sophisticated and complicated process: but how is it that we retrieve those memories once they are stored? Human memory incorporates a variety of ‘search fields’, in the form of associational cues, in order to retrieve that information. The brain cross-references memories through a series of sensory triggers that index the relationships between our experience and the images, sounds, smells, emotions, colours, sensations and intonations we associate in our memories with that experience. In order to function as an information management system, the brain has developed into a sophisticated structure that relies on retrieval schematics, or cues, such as mnemonic formulas (memory keys) designed to file and retrieve information in the form of memory. The neocortex, the records management system that houses those long-term memories, functions as a “context-dependent rather than location-addressable memory system” (Marcus 2009).

The associational retrieval cues operating in the human brain are the ‘colour-coded labels’ that trigger retrieval of information from long-term memory storage. For instance, your grandmother’s stories from childhood might be inextricably linked to the cue of the scent of baking scones. The difference between human memory retrieval and computerised retrieval is that the brain always maintains the context of the record wherever it is filed in our memory, regardless of how difficult it sometimes is to initiate recall. Hence, the taste of an exceptional wine enjoyed in the company of good friends on a beautiful day can never be repeated in exactly the same way, and the smell of baking scones is forever linked to your grandmother.

 

 

Nothing is more memorable than a smell. One scent can be unexpected, momentary and fleeting, yet conjure up a childhood summer beside a lake in the mountains; another, a moonlit beach; a third, a family dinner of pot roast and sweet potatoes during a myrtle-mad August in a Midwestern town. Smells detonate softly in our memory like poignant land mines hidden under the weedy mass of years. Hit a tripwire of smell and memories explode all at once. A complex vision leaps out of the undergrowth.
~Diane Ackerman, A Natural History of the Senses

 

 

Memory records also differ from paper or electronic records in that the majority of memory records are not maintained, or even delineated, in a verbal format. All of our sensory organs are utilised in the creation of a memory record, and the result is rarely translated into a verbal or written record – human emotion is the prime example, most notably the sensation of fear. Language is an inadequate form in which to relay the complexity of most human perceptions; the ‘hard copy’ or written form of human memory therefore falls far short of the actual experience.

The human brain is not homogeneous as a records management system, but displays the attributes of flexibility and scalability. While the basic structures are the same for all of us, there are differences in how memory retrieval systems work in the individual, whether due to neurological differences or sometimes to trauma. Animal scientist Dr. Temple Grandin (2016) suggests that her autistic brain functions much like a search engine:

My brain is visually indexed. I’m basically totally visual. Everything in my mind works like a search engine set for the image function. And you type in the keyword and I get the pictures, and it comes up in an associational sort of way (video).

Grandin is not unique in being a visual thinker: as an autistic person she is an extreme example of that mode of memory retrieval. Like all humans, and despite being completely visual in her thinking, Grandin retrieves her memories in an associational fashion – not as items out of a storage box, but as contextual memory, in her case in image form. This is what all humans do, whether visual, textual or auditory thinkers. Human memories are stored through the process of experience and association. We experience, and then we associate other factors with that experience, further encoding it into our long-term memory system. Associations then become the cues that allow us to retrieve the memory after we have created it. How we retrieve it – whether our preferred or default method is visual, auditory or another sense – is up to the unique wiring of our brains.

Oral rhetoricians have traditionally employed systems of ‘aides-mémoire’ or ‘memoria technica’ as generalized codes to improve their all-round capacity to remember and as an aid in oral, and later written, composition. Simple rhymes are commonplace in many cultures, especially ones meant to help children remember basic concepts: “I before E except after C”. Acronyms and acrostics tend to be confused with one another in contemporary usage, with acronyms especially becoming the language of corporations and government throughout the world: IBM, AWOL. Acrostics for remembering the musical scale and how to spell arithmetic – “Every good boy deserves fudge” and “a rat in the house might eat the ice cream” – help us to remember not only the deliberately filed information, but also trigger the retrieval of childhood memories of piano lessons and math tutors. The classic block numeric classification scheme employed by librarians and records managers world-wide is actually a grouping mnemonic, used to classify lists on the basis of some common characteristic(s). Mental imaging, or the peg method, links words by creating a mental image of them, such as remembering a grocery list by having the items interact with one another in a bizarre fashion that stimulates short-term memory creation. ‘Loci et res’ as a mnemonic system involves assigning things a place in a space, and can be explored through the science of architectonics (Parker 1977). What they all have in common is how they stimulate the memory systems of the brain to store, and allow later retrieval of, information in the form of memory.
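The grouping principle behind the block numeric scheme is easy to see in code. The Python sketch below is illustrative only – the blocks and category labels are invented, not drawn from any real classification scheme or retention schedule – but it shows how a number block acts as a grouping mnemonic: the leading digits alone cue the category.

    # Illustrative sketch: a block numeric classification scheme as a
    # grouping mnemonic. Blocks and labels are invented for illustration.
    BLOCKS = {
        (1000, 1999): "Administration",
        (2000, 2999): "Finance",
        (3000, 3999): "Human Resources",
        (4000, 4999): "Operations",
    }

    def classify(file_number):
        """Return the primary category a file number falls into."""
        for (low, high), category in BLOCKS.items():
            if low <= file_number <= high:
                return category
        return "Unclassified"

    print(classify(2450))   # "Finance" -- the leading digit alone cues the group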

While the human brain is capable of making sophisticated associations and computations of diverse philosophical, mathematical and logical construction, computers, by way of comparison, can make only limited associations based on their programming. Standard records management retrieval systems are therefore limited in their points of access.

Paper files are usually linked to alpha-numeric storage and retrieval patterns, and computer databases have a limited number of search fields, which require long-term planning and programming to prevent redundancy before the stored information has completed its active life-cycle. For example, you may design a personnel database with a Social Insurance Number search field, which may become problematic when privacy issues come to the fore in the field of records management. Human memory, on the other hand, is unlimited in its storage and retrieval patterns. It is elastic – a stored record can be retrieved by a wide variety of methods, some seemingly unrelated to the actual information: ‘déjà vu’, for example.

 

 

Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.
~Steven Wright

 

 

The human brain has built-in collocating functions, syndetic structure and some pretty awesome authority control. One might think that a relational database used to store information in a records management system, in a library database or on the internet would handle information in much the same way as the human brain. However, unless the database is programmed to collocate the records upon retrieval, and thereby avoid duplication, the resulting list will resemble the output of an internet search. Someone has to tell those little bots to collocate before they collect.
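To make the point concrete, here is a minimal Python sketch of collocation through authority control; the names and records are invented for illustration. Variant forms of a name are normalized to one authorized heading before the results are grouped, so retrieval brings related records together instead of scattering them.

    # Illustrative sketch: collocation through authority control.
    # Variant name forms map to one authorized heading before grouping.
    from collections import defaultdict

    AUTHORITY = {
        "Clemens, Samuel": "Twain, Mark",
        "S. L. Clemens": "Twain, Mark",
        "Twain, Mark": "Twain, Mark",
    }

    records = [
        ("Clemens, Samuel", "Letter, 1885"),
        ("Twain, Mark", "Manuscript draft"),
        ("S. L. Clemens", "Publishing contract"),
    ]

    collocated = defaultdict(list)
    for name, item in records:
        heading = AUTHORITY.get(name, name)   # tell the bot to collocate first
        collocated[heading].append(item)

    print(dict(collocated))
    # {'Twain, Mark': ['Letter, 1885', 'Manuscript draft', 'Publishing contract']}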

What the classical mnemonists and modern practitioners are doing when they deliberately design systems for memory aid is taking the process that the human brain naturally goes through to create memory records, or metadata, and enhancing it to an extremely sophisticated level of data storage and retrieval. Remember that hippocampus, dependent on repetition and association to get experience and information into long-term memory. The sensory perceptions (images, sounds, smells, emotions, colours and sensations) associated with the creation of memory records in the human brain become access points for retrieval. The context (who, what, where, when and why) plus the content (data, information, experience) form the metadata of the records management system. For example:

Context = friends, wine, conversation, food
Content = debate on the divine nature of Christ
Memory Record/Metadata = names, faces, clothing, the lighting in the room, the paintings on the walls, the smell of wood-fire, the taste of the food
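Read as a records structure, that example translates directly into code. The Python sketch below is illustrative only (the field names and cues are our own): a single ‘memory record’ is indexed under every contextual and sensory cue in its metadata, so any one cue retrieves the whole record with its context intact – the way the smell of baking scones recalls your grandmother.

    # Illustrative sketch: a "memory record" indexed under every cue in
    # its metadata, so any one cue retrieves the record, context and all.
    record = {
        "context": ["friends", "wine", "conversation", "food"],
        "content": "debate on the divine nature of Christ",
        "cues": ["names", "faces", "clothing", "lighting",
                 "paintings", "smell of wood-fire", "taste of the food"],
    }

    index = {}
    for cue in record["context"] + record["cues"]:
        index.setdefault(cue, []).append(record)   # every cue is an access point

    match = index["smell of wood-fire"][0]
    print(match["content"])   # the whole record comes back from a single cue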

  

Human versus Computer

Despite its recent lack of ‘exercise’, the human memory facility is so well constructed and organised that not even Martha Stewart could improve it. While human memory is susceptible to viruses such as Alzheimer’s and to total system failure such as amnesia, it does not require constant upgrades. Even as the human brain alters through evolution, there are no analogous problems of transferring data between software programs or of hardware incompatibility; despite language, cultural differences and time, the essence of experience appears to be constant. Technology, however, is rapidly replaced, upgraded and elaborated upon, and quite often does not allow for retrospective software compatibility or hardware rewiring. The human memory facility remains constant; you do not have to plug in any new bits of organic matter to make it go faster, better or more colourful. It is inexpensive, portable, space-saving, efficient and dust-free, and does not require a battery of info-technicians to help you when you have a glitch. Access to electronically stored information, on the other hand, can disappear with the pop and fizz of a short circuit, a virus, or a malicious hack.

Human memory records are filed and retrieved through emotive sensations. They engage all of our senses as we perceive the content of the message. Can you imagine a computer program giving you the feeling of immense contentment when you have retrieved that little bit of data? Because experiential memory supersedes factual memory, when you are told a story the environment has as much to do with the retention of the memory as the tale itself. Human emotive metadata, combined with modern records management storage and retrieval systems, could result in retrieval systems that mimic the manner in which the human brain creates multi-layered retrieval cues based on the senses and the emotions, which the brain later employs to retrieve the information.

It is evident that humans possess an incredibly sophisticated method of creating and storing memories that allows us to be creative thinkers and toolmakers. But is it really a stretch to suggest that our brain’s capacity for such activities can seriously rival an electronic records management system, or that it should be considered the template for future development activities in records and information management or in the field of computer science? Consider this:

In 1973, a Canadian psychologist called Lionel Standing showed volunteers a series of photographs of objects for about 5 seconds each. Three days later, the volunteers were shown paired photographs, one that they had seen before and the other new, and were asked to say which was familiar. Standing increased the number of photographs shown to each person to an astonishing 10,000 and still they managed to identify the ones they’d seen before, with very few mistakes. Although this experiment tested whether they recognised something put in front of them, which is much less challenging than recalling something without any external cue, the results suggest that some aspects of human memory are effectively limitless (Bendall 2003, 1).

Mechanistic storage and retrieval within a limited and compartmentalized human brain was the theoretical blueprint that underlay early computer science, and that definition was used to theorize and make assumptions about brain development and capacity (Wang 2003). The recent theoretical shift in cognitive informatics towards developing technology based on the human brain, with its infinite possibilities for storage and retrieval, has implications for our understanding of how humans organize information and retrieve it. Our current understanding of the human brain can and should influence how we design better systems that use our brain’s capacity for relational data management and relevance ranking. Essentially, the human brain could be the model for a scalable solution to information storage and management.

The authors believe that respect for human memory and oral records management systems can be restored through a more thorough understanding of the capacity of all humans to store and manage information. Through scientific understanding we can harness those capacities to enhance and design modern records management systems. We know that, at best, computers can store a billion bits of information. Human memory is capable of storing one hundred trillion bits of information (Paris 1990, 121) and making up to 500 trillion possible connections among the neurons of the brain (Arenofsky 2001). As suggested by Cambor (1999), most of us are awed by stories of people like:

The Greek statesman Themistocles, who in the 5th Century BC, is said to have been able to call by name all 20,000 citizens of Athens… [Any computer] could accomplish such a task of ‘memorization’ without even trying, but who would be impressed? Yet, if a person today did anything analogous, who wouldn’t be? (5)

 

Conclusion

Themistocles needn’t be seen as an anomaly in human brain capacity. Armed with a new understanding of how the human brain has been and can be employed in oral records management systems, Themistocles can represent that which is largely forgotten or underutilized: the potential of the human brain trained for efficient and effective storage and retrieval.

A computerized database is impersonal in that you may not be the one to plug in the information, but you are the one to retrieve it. You are not necessarily a participant in the creation of the record that is to be retrieved. Databases of the future could allow the creator and the user to interact with one another in the creation of living records that are altered or enhanced with each retrieval or storage of new information. External storage and retrieval systems may benefit enormously from a more extensive examination of human memory, with a view to developing systems that are more intuitive and responsive to natural human processes. Human memory is akin to data in five dimensions, layered with perceptions from each of the senses, creating a comprehensive and experiential understanding of the information. It is likely, with the acceleration of technology, that innovations incorporating some of these factors will emerge in the near future. We already have artificial intelligence and virtual reality enhancements available in the marketplace. It is not such a grand step forward to envision a more comprehensive and experiential human interaction with data through technology.

 

Works Cited

Canada Evidence Act. R.S.C., 1985, C. C-5.
“Interview with Mr. Howard E. Grant.” Interview by Sandra M. Dunkin. Sept., and Nov. 2016.
Tsilhqot’in Nation v. British Columbia, 90-0913 BCSC 1700, 36 (2007).
Arenofsky, Janice. 2001. “Understanding how the Brain Works.” Current Health 1 24 (5): 6-11.
Bendall, Kate. 2003. “This is Your Life…” New Scientist 178 (2395): S4.
Cambor, Kate. 1999. “Remember This.” The American Scholar (Autumn).
Carmichael, Mary. 2004. “Medicine’s Next Level.” Newsweek 144 (23): 50.
Carr, Nicholas G. 2010. The Shallows. New York: Norton.
Connelly, Jim. 1995. “Designing Records and Document Retrieval Systems.” Records Management Quarterly (April).
Connolly, John. 1996. “You must Remember This.” Sciences 36 (3): 2.
Freed, Michael. 1997. “Re: Is Human Memory Similar to the RAM in a PC?” Aerospace Human Factors, NASA Ames Research Center, last modified January 6, accessed September 6, 2016, http://www.madsci.org/posts/archives/1997-03/852177186.Ns.r.html.
Gaidos, Susan. 2008. “Thanks for the Future Memories.” Science News 173 (19): 26-29.
Temple Grandin on Her Search Engine. 2016. Animated Video. Directed by David Gerlach. PBS Digital Studios: Blank on Blank.
Hermann, D. 1993. Improving Student Memory. Toronto: Hogrefe & Huber.
Herodotus. 1976. The Histories. Edited by Aubrey de Selincourt. New York: Penguin Books.
Homer. 1963. The Odyssey. Translated by Robert Fitzgerald. New York: Anchor Books.
Hsieh, Sharpley. 2012. “The Language of Emotions in Music.” Australasian Science 33 (9): 18-20.
Kelber, Werner. 1995. “Language, Memory, and Sense Perception in the Religious and Technological Culture of Antiquity and the Middle Ages.” The Albert Lord and Milman Parry Lecture for 1993-1994. Oral Tradition 10 (2): 409-45.
MacManus, Seumas. 1967. The Story of the Irish Race. New York: The Devin Adair Company.
Marcus, Gary. 2009. “Total Recall: The Woman Who Can’t Forget.” Wired 17 (4). https://www.wired.com/2009/03/ff-perfectmemory/
Parker, Rodney. 1977. “The Architectonics of Memory: On Built Form and Built Thought.” Leonardo 30 (2): 147.
Paris, Ginette. 1990. Pagan Grace: Dionysos, Hermes, and Goddess Memory in Daily Life. Dallas: Spring Publications.
Plato. 1925. Phaedrus. In Plato in Twelve Volumes, Vol. 9, translated by Harold N. Fowler. London: William Heinemann Ltd. http://www.english.illinois.edu/-people-/faculty/debaron/482/482readings/phaedrus.html
Rakic, P. 2009. “Evolution of the Neocortex: A Perspective from Developmental Biology.” Nature Reviews Neuroscience 10 (10): 724-735. http://doi.org/10.1038/nrn2719
Saul, John Ralston. 2008. A Fair Country: Telling Truths about Canada. Toronto: Viking Canada.
Schmickle, Sharon. 2010. “Why You Don’t Want the Dragon-Tattooed Lady’s Photographic Memory.” MinnPost.Com, July 8.
Wang, Yingxu. 2003. “Cognitive Informatics: A New Transdisciplinary Research Field.” Brain and Mind 4 (2): 115-127. doi:10.1023/A:1025419826662.

Records Management in Canadian Universities: The Present and the Future

By Shan Jin, MLIS, CRM, CIP

 

Introduction

This article presents findings from in-depth interviews with twenty-six records managers, archivists and privacy officers who work in twenty-one Canadian universities. It provides a comprehensive view of current records management practices in Canadian universities. The main topics include program staffing, program placement, records retention schedules and classification schemes, physical records storage and destruction, university records centres, Electronic Document and Records Management Systems (EDRMS), training, and outreach and marketing. It also examines the relationships between the records management program and internal stakeholders, and identifies the need for knowledge sharing and collaboration in the academic records management community in Canada.

 

Literature Review

In both Canada and the United States, modern records management started with the federal government. Records management, as a professional management discipline, has been established for more than sixty years (Langemo 2; Fox 1). However, only a small number of scholarly articles have been written on records management programs in the higher education environment in North America, and even fewer focus on Canadian universities.

From the early days, university archival programs often assumed responsibility for records management (Saffady 204). Until recently, many universities’ records management functions still largely resided with the archivist (Zach and Peri 106). From 1990 to 2010, several studies on academic records management programs were conducted by researchers using surveys and interviews. Some were large-scale studies, such as Skemer and Williams’ 1990 survey on the state of records management, whose findings were based on responses from 449 four-year colleges and universities in the United States. Twenty years later, Zach and Peri conducted updated research on college and university electronic records management programs in the United States. Their article presented findings from their 2005 online survey of 193 institutions and interviews in 2006 with 22 academic archivists, as well as their 2009 online survey of 126 institutions. Although the focus of these two studies was not on Canadian universities, they provided some comparable data that are referenced in this article.

There were some small-scale studies which complemented the Zach and Peri research. Schina and Wells’ 2002 survey of fifteen American institutions and fifteen Canadian institutions provided relevant information from more than a decade ago, which is cited in the findings section of this article. Furthermore, there were two comparative studies that presented historical information on the records management programs in the University of British Columbia and Simon Fraser University (Brown, et al. 1-20; Külcü 85-107).

Higher education institutions have unique organizational structures and institutional cultures and traditions, which affect how records management programs operate within a university. Since there is a lack of comprehensive studies on records management programs in Canadian higher education institutions, this study will help to fill a research gap.

 

Research Scope and Methodology

Universities Canada (formerly known as the Association of Universities and Colleges of Canada) has ninety-seven member colleges and universities. Since it would be difficult to collect information from all of these universities over a short period of time, the author used a sampling method to decide the criteria for selecting participating universities for the study.

A quick email survey was sent to the records managers, archivists, or privacy officers of twenty Council of Ontario Universities (COU) members. The author asked these universities if they had a formal records management program with at least one employee who worked on records management for a minimum of fifty percent of his or her time. As demonstrated in the survey responses, none of the small Ontario universities (with fewer than 10,000 students) had such a records management program (see table 1). Based on this finding, the author decided that eligible universities for this study would be those with an enrolment of at least 10,000 students, because those are more likely to have a formal records management program.

 

Table 1

Due to limited resources for the study, the author chose to collect data using individual interviews instead of large-scale surveys. Between April 2015 and January 2016, thirty potential participants were contacted via email with a cover letter and a consent form and invited to participate in the study. Eventually, twenty-six records managers, archivists, and privacy officers from twenty-one publicly-assisted Canadian universities agreed to be interviewed. Table 2 lists the number of participating institutions by province.

 

Table 2

Upon receipt of the consent forms from participants, an in-depth 90-120 minute interview was scheduled with each participant. A questionnaire was sent to them ahead of the scheduled interview so they could prepare for it. Interviews were conducted in one of three ways: face to face, by telephone, or by video conference. An audio recording was made with the permission of each participant. Eight site visits were also made during the same ten-month period. Additional information was gathered from email follow-ups and from the web sites of the participating institutions. To protect the anonymity of participants, findings of this study reflect group results and not information about specific individuals or universities, with the exception of publicly available information.

 

Findings and Common Concerns

Program Staffing

The study looked at the educational level of the persons responsible for the records management programs. Eighty-eight percent of the twenty-six participants have one or two master’s degrees in library and information studies, archival studies, or history. Thirty-eight percent of the twenty-six participants were hired or moved to their current records management related positions in the last three years. The data gathered from the interviews are listed in table 3. They show that the larger the student enrolment of the university, the higher the full-time equivalent (FTE) staffing of its records management unit.

 

Table 3


The author also asked the participants what percentage of their time was devoted to records management related duties. The responses show that many of the participants of this study have responsibilities in areas other than records management; on average, they spend 67% of their time on records management.

 

Records Management Program Administrative Placement

Unlike government agencies and private companies, Canadian universities often have a shared governance system. The academic side directly supports teaching, learning and research functions, and the non-academic side supports administrative functions. Early university records management programs often reported to university archives, an academic unit that is usually a part of the university libraries. Data collected from the interviews show that records management programs established in the last decade are moving away from university archives and libraries, and report to a senior administrative department, such as the University Secretariat and General Counsel.

Eighteen of the twenty-one universities that participated in this study have a formal records management program. All five newer programs (established less than ten years ago) report to an administrative unit.

Older records management programs (ten years or more) are split, with six reporting to a senior administrative department and seven reporting to an academic department. In total, 61% of the eighteen records management programs report to a senior administrative unit; the rest report to an academic unit (see table 4).

 

Table 4

 

All participants of the study shared their thoughts on the pros and cons of both placements. As summarized in table 5, both reporting structures have their strengths and weaknesses. Archivist William Maher provided some interesting insights in a discussion of academic archives’ administrative location in his book The Management of College and University Archives.
Maher pointed out there was “no single location that is best for all purposes” (23). He continued to say that too often “attention to the question of location is driven by dissatisfaction with limits imposed by the current parent department and the hope that some other parent would provide better support” (23). Although Maher was talking about archivists’ opinions on academic archives’ administrative location in the hierarchy of the college or university, it seems participants of this study have a similar mentality when it comes to the discussion of a records management program’s organizational placement. Regardless of where the records management program is located, the author believes that records managers must capitalize on the advantages and overcome the disadvantages of their organizational placement in order to seek ways to improve records management services. It is important to align efforts from the records management program with other strategic partners such as the archives, Information Technology (IT) security, the legal department, the privacy and compliance office, etc.

 

Table 5

 


Records Retention Schedule and Classification Schemes

Records retention schedules and classification schemes are the basic components of a sound records management program (Kunde 190). All participating universities that have a formal records management program have established a classification scheme. According to participants of this study, developing records schedules is an ongoing task. Common records schedules are a priority because these schedules are used by all university departments.

The processes for drafting records retention schedules are very similar from one Canadian university to another, but final approval processes vary dramatically. Records schedules are:

  • Approved by a University Records Management Committee;
  • Signed off by a records director, or the president of the university, or a non-records management specific senior committee;
  • Not formally approved by any group in the university.

In Québec, the Archives Act requires that: every public body shall establish and keep up to date a retention schedule determining the periods of use and medium of retention of its active and semi-active documents and indicating which inactive documents are to be preserved permanently, and which are to be disposed of (3).

Also, the Act requires every public body to, “in accordance with the regulations, submit its retention schedule and every modification of the schedule to Bibliothèque et Archives Nationales for approval” (3). Such a process takes a long time; however, the biggest advantage is that the schedules become law. Going through the provincial government approval process gives the records schedules more validation, and compliance with schedules is mandatory in Québec.
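To make the shape of such a schedule concrete, the sketch below models a retention schedule line of the kind the Act describes, with periods of use for the active and semi-active phases, a retention medium, and a final disposition. The field names and example entries are invented for illustration, not drawn from any institution's actual schedule.

    from dataclasses import dataclass

    @dataclass
    class RetentionRule:
        """One line of a hypothetical records retention schedule."""
        classification_code: str   # ties the rule to the classification scheme
        record_series: str
        active_years: int          # period of use while the record is active
        semi_active_years: int     # retention period in semi-active storage
        medium: str                # e.g. "paper" or "electronic"
        disposition: str           # "destroy" or "preserve permanently"

    # Invented example entries, for illustration only.
    schedule = [
        RetentionRule("HR-100", "Employee personnel files", 5, 20,
                      "paper", "destroy"),
        RetentionRule("GOV-010", "Board of Governors minutes", 2, 5,
                      "electronic", "preserve permanently"),
    ]

    for rule in schedule:
        total = rule.active_years + rule.semi_active_years
        print(f"{rule.classification_code}: retain {total} years, then {rule.disposition}")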

In provinces outside Québec, compliance with records retention schedules is the responsibility of individual offices and is voluntary. Based on the study findings, university records managers often take on advisory or assistance roles. It is not their mandate to be the records management police, for instance by enforcing compliance with retention schedules at a departmental level, but university records managers can encourage compliance by:

  • Defining roles for department/unit heads and staff in records management policy;
  • Providing training and creating tools to assist employees with records management tasks;
  • Using persuasion to encourage employees to use records retention schedules and classification schemes to manage records; and
  • Setting up a departmental records management coordinators network for better communication.

 

Physical Records Storage and Destruction Services

Canadian universities often have a decentralized budget model. When it comes to records storage and destruction, each department or unit is likely to adopt a self-managed solution, but some universities still make an effort to provide a central or a hybrid solution.

Data collected from the interviews indicate that:

  • Four universities have set up a University Records Centre or use a commercial facility for records storage. All activities are monitored by the records management program;
  • Six universities have a hybrid solution whereby departments and units can choose from using a centrally managed storage service or managing records on their own;
  • Many universities use policies and preferred vendors to regulate records storage and destruction activities on campus;
  • Fifteen out of the twenty-one universities (71%) have developed records destruction procedures to formalize records destruction activities;
  • Fourteen out of the twenty-one universities (67%) have a preferred shredding service provider;
  • Only two universities have total control of records destruction on campus, and destruction activities are carried out through their University Records Centres.

Most of the records destruction activities are self-managed by departments (see table 6).

 

Table 6

 

The responses from the participants of this study indicate that managing physical records is still a major responsibility for university records managers. Despite the decentralised nature of a university’s organizational structure and budget model, the author believes records managers should seek a degree of central control over physical records storage and destruction.

 

University Records Centre

Building or creating a university records centre is one way to gain central control over the storage of semi-active records and the destruction of inactive records. In Skemer and Williams’ 1990 study, 52% of American universities provided records centre storage (542). Data collected from the interviews show that eight of the twenty-one universities (38%) that participated in this study have their own university records centre or records storage facility.

Table 7 shows some of the services provided by these Canadian university records centres.

 

Table 7

Electronic Document and Records Management Systems (EDRMS)

Most of the participants of this study did not feel confident tackling the daunting task of managing electronic records without the right software tools. One of the best solutions for managing electronic records is an EDRMS, which is designed to facilitate the creation, management, protection, security, use, storage and disposal of a range of both physical and electronic documents.
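As a rough sketch of the lifecycle such a system automates (the phases and transitions below are a generic simplification, not any particular vendor's design), each record moves from active use through semi-active storage to a final disposition dictated by the retention schedule:

    from enum import Enum, auto

    class Phase(Enum):
        """Simplified lifecycle phases an EDRMS tracks for each record."""
        ACTIVE = auto()       # in regular business use
        SEMI_ACTIVE = auto()  # retained but rarely consulted
        DESTROYED = auto()    # disposed of under an approved schedule
        ARCHIVED = auto()     # transferred for permanent preservation

    def next_phase(phase, disposition):
        # Advance a record one lifecycle step; the disposition
        # ("destroy" or "preserve") comes from the retention schedule.
        if phase is Phase.ACTIVE:
            return Phase.SEMI_ACTIVE
        if phase is Phase.SEMI_ACTIVE:
            return Phase.ARCHIVED if disposition == "preserve" else Phase.DESTROYED
        raise ValueError("record has already reached its final disposition")

    # A record moves: active -> semi-active -> destroyed (or archived).
    phase = Phase.ACTIVE
    phase = next_phase(phase, "destroy")  # now SEMI_ACTIVE
    phase = next_phase(phase, "destroy")  # now DESTROYED
    print(phase)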

According to responses from the interviews, six out of the twenty-one universities (29%) are providing some degree of central software solution to manage electronic records. However, there are many challenges, especially for a highly decentralised organization like a university. Here are some common ones:

  • Available solutions are too expensive;
  • Lack of a central approach;
  • Offices are more interested in business automation, so records retention and disposition are not their major concerns; and
  • Records management functionality is often overlooked when records management staff only play an advisory role in an EDRMS project.

One successful EDRMS project is Concordia University’s eDocs project, which has been ongoing since 2013. They were able to secure central funding through their Vice-President of Development and External Relations and Secretary-General (Peacock). The solution is free to all faculty and staff, but adoption is voluntary. The key to the success is that the EDRMS project is co-led by the records management department and the IT department. The project had five full-time employees according to the Records Management and Archives department’s organizational chart from 2015:

  • one Project Manager,
  • two Business Process Analysts,
  • one Change Management Lead
  • and one Archivist / Records Officer.

Phase I of the eDocs project was completed in 2015, with the application installed on 300+ users’ desktop computers. Phase II will expand to 500+ users. The EDRMS replaces shared drives, and it has a records classification plan embedded in the system. This success story reaffirms that a successful EDRMS project must have high-level support and cooperation from the office of IT or a unit of equivalent function (Zach and Peri 122).

As Kunde suggested, records managers need to engage in activities that position them to be more active partners in managing an institution’s information resources, particularly those that are in an electronic format (189). In order to gain a better control over electronic records, the author believes records managers, privacy officers, archivists and IT should work together to offer some central solutions. For example, this might involve:

  • Standardizing processes for any future EDRMS projects;
  • Making a cross-functional team approach to implementing an EDRMS mandatory; and
  • Providing joint training programs with IT on best practices for managing electronic documents and records.

 

Training

Participants in this study are providing training using both traditional methods and new technologies. There are formal classroom training sessions and less formal information sessions, such as Lunch and Learn. University records managers usually have easy access to learning management systems, so they can explore web-based training programs. Many learners prefer this format because they can go online and learn at their own pace. Some of the participants mentioned they have started to use multimedia technologies, such as YouTube videos, webcasts, and podcasts, for training.

The content of records management training varies. Records managers choose the content according to the needs of the audience. It can be an introduction to records management, or advanced courses on implementing a file plan, email management, and managing shared drives. Records managers also target different audiences, including senior management, office administrators, records management coordinators, and new employees. Many participants talked about making training a joint effort of the records management program, privacy office and information security office.

Schina and Wells pointed out that systematic training is the key to the success of a records management program (48). The author believes that university records managers should try their best to allocate staff and time to train employees with records management duties.

 

Outreach and Marketing

A records management program hidden from public view is often misunderstood and forgotten by the very people on campus who rely on its services (Purcell 134). The records management program needs to raise awareness among employees and improve its visibility. To achieve this goal, records management staff need to utilize all resources available. This might include:

  • Setting up a records management coordinators network;
  • Having regular meetings with senior management;
  • Generating good word-of-mouth through interactions with employees; and
  • Providing useful tools on the records management web site.

Through this study, the author has learned some creative methods colleagues in other universities have adopted to market their records management programs, such as using records management-themed coasters, mouse pads, and fortune cookies with records management tips inside. A Records Management Day with free pizza turned out to be an effective way to boost employee morale and raise awareness about the records management program.

The author thinks records managers need to find ways to reach employees and promote the records management program, for example, establishing a records administrators group, attending campus events and putting articles in campus newspapers.

 

Records Management Program and Internal Stakeholders

RM and Privacy

The author learned from this study and work experience that Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) has had a great impact on records management programs in Ontario universities, because the basis of the Act is the right to access information held by public bodies, and the right of access depends on the appropriate management and preservation of records (“FIPPA and MFIPPA” 1). In Ontario, four of the nine university records management programs were launched after Ontario universities were brought under the FIPPA legislation in June 2006. The newest records management program was the direct result of a change to FIPPA in December 2014, which requires that every head of an institution take reasonable measures to preserve records in the custody or under the control of the institution.

Participants of the study from outside Ontario also mentioned the great impact of their provincial access and privacy legislation on their own records management programs. Similarly, Skemer and Williams’ 1990 survey revealed that state laws and regulations were important reasons for the creation of records management programs in publicly supported institutions in the United States, because legal pressure was probably more persuasive in colleges and universities that rely principally on public financing (537; 545).

Records management programs and privacy offices often have a collaborative relationship. According to data gathered in this study, seven out of the eighteen (39%) records management programs report to the same senior administrative office as their university’s privacy office.

Twenty-seven percent of the participants have both records management and compliance support duties. Many participants of this study agreed that the FIPPA legislation was a strong driving force for records management programs. The records management program and the privacy office share the same goal of educating people on best records management practices, but with some different emphases.

 

RM and Archives

Many Canadian universities’ records management programs emerged from university archives. Data from this study show eleven out of the eighteen (61%) university records management programs are placed in the university archives or in the archives and records management joint department. Forty-two percent of the participants have both records management and archives duties.

University records management staff often seek input from archivists regarding records retention and disposition. Archivists understand the value of the records management program to their own program; records management staff act as advocates for the archives, and make sure archival records will be transferred to the archives. In the author’s view, the records management and archives programs are natural allies who share common interests. This is especially true when university archives have the mandate to collect institutional records. As Purcell pointed out, a strong relationship between the records manager and the academic archivist is a sign of a successful records management program (134).

 

RM and IT

In their 2002 article, Schina and Wells stated that both U.S. and Canadian university records managers wished to develop a closer relationship with their IT colleagues and to participate in electronic records management decisions (43). Fourteen years later, this remained a concern for the participants of this study. During the interviews, many mentioned that there was a disconnect, and sometimes miscommunication, between the two departments. For example, IT staff and records managers may understand the word “archiving” very differently. Many participants said that when IT was leading an EDRMS project, such disconnects and miscommunication often led to the exclusion of records management staff from EDRMS initiatives on campus. Sometimes, records management became an afterthought.

In order to improve this situation, the author thinks records managers should try to build a mutually beneficial relationship with IT. The two units do have some common interests, including EDRMS, cloud computing, and information security. Records managers need to show what the records management team can bring to the table. For example, records management professionals are experts on data retention who advocate reducing server storage space and eliminating duplication. When IT sees the potential benefits of collaborating with the records management team, it is more likely that they will be willing to work with records management staff on EDRMS-related projects. There is a need to align records management efforts with IT efforts.

In summary, experts from records management programs, archives, privacy offices and IT departments can bring their unique perspectives into conversation about managing a university’s information assets. It is important to create and maintain strong partnerships among all these internal stakeholders in order to achieve the university’s overall goals of information governance.

 

Communication and Collaboration (External)

Kunde pointed out that, with the reality of small records management programs and low levels of staffing, efforts to collaborate can benefit everyone (205). During the interviews, many participants expressed their interest in seeking collaborative opportunities with colleagues at other universities.

Universities in the same province are governed by the same laws and regulations. In the author’s opinion, there are opportunities for knowledge sharing and collaboration among university records managers. In British Columbia, four universities started to hold monthly teleconference calls in 2015. In Ontario, the Council of Ontario Universities (COU) provides a communication platform based on the existing Records Managers Group web portal and mailing list. Records managers of COU members also started to hold regular teleconferences in spring 2016. These activities indicate a good start.

Any collaboration initiative needs strong leadership, funding and human resources. There are good examples from the academic library communities, such as the Ontario Council of University Libraries’ Collaborative Futures project. Collaborative Futures maximizes the existing expertise and resources among Ontario academic libraries. One of the projects is to manage and preserve print resources. Five Ontario universities (McMaster, Ottawa, Queen’s, Toronto and Western) undertook the project to consolidate physical library materials into one shared space at the University of Toronto Libraries’ Downsview property. The Downsview facility implemented a high-density racking system and mechanized retrieval to ensure orderly retrieval of low-use print material (“Improving” 8). Access to the entire collection is provided by an online request service, supported by a daily courier (“Welcome”). Such a model could be used for storing physical university records as well. For example, the University of British Columbia’s Library PARC is a similar high-density storage facility, but it is used for both low-circulation library collections and university records (“Library Parc”).

University records managers can also learn from the collaborative efforts of Ontario municipalities. In the 1990s, Ontario’s Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) came into effect. One company, the Information Professionals, worked in conjunction with The Association of Municipal Managers, Clerks & Treasurers of Ontario (AMCTO) and developed TOMRMS (The Ontario Municipal Records Management System).

TOMRMS is a model system for managing records, with a classification and indexing scheme, retention schedules, and citation tables which refer to applicable laws. As of 2016, TOMRMS has been used by about one hundred Ontario municipalities. Can smaller academic institutions with newly established records management programs collectively outsource the task of developing classification schemes and records retention schedules? Eventually it is up to the university community to seek common interests and start the conversation.
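To make concrete what a classification scheme with retention schedules and citation tables looks like in data terms, here is a small, TOMRMS-inspired sketch. The codes, retention periods, and citations below are illustrative placeholders, not actual TOMRMS content.

    # Invented, TOMRMS-inspired fragment: each classification code maps to a
    # record series, a retention period, and the legislation that justifies it.
    # All codes, periods, and citations are illustrative placeholders only.
    classification = {
        "A01": {"series": "Agreements and Contracts", "retention_years": 7,
                "citations": ["Limitations Act, 2002, S.O. 2002, c. 24"]},
        "F05": {"series": "Accounts Payable", "retention_years": 7,
                "citations": ["Income Tax Act, R.S.C. 1985, c. 1 (5th Supp.)"]},
    }

    def lookup(code):
        entry = classification[code]
        cites = "; ".join(entry["citations"])
        return (f"{code} {entry['series']}: retain "
                f"{entry['retention_years']} years ({cites})")

    print(lookup("A01"))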

 

Conclusion

Canadian universities have been practicing records management for more than half a century. The University of Toronto was among the earliest English-language Canadian universities to establish a records management program. In 1989, the University of Toronto’s president passed the Presidential Regulations for the Management of Archives and Records, which marked the launch of its records management program (“About the Records Management Program & Services”). Today, larger institutions like the University of Toronto generally have an established records management program, while smaller institutions only started to develop their own records management programs in the last decade, due to legal requirements. Provincial access and privacy legislation has often had a significant influence on academic records management programs because the right of access to information depends on the appropriate management and preservation of records (“FIPPA and MFIPPA” 1).

Although data were collected from only twenty-one academic institutions (a sample of 21.6% of all Canadian colleges and universities), the results of this study can still reveal some general patterns in current recordkeeping practices in public Canadian universities, such as a clear trend of relocating the records management program outside academic units such as the university libraries, which mainly support teaching, learning and research. Among the records management programs surveyed in this study, about 28% were established in the last decade, all of which report to a senior administrative office. In total, 61% of the programs are located in a non-academic unit. Many believe that placing the records management program in a senior management unit can raise its profile, as university-wide records management decisions made by senior administration are often perceived as having more authority.

Most Canadian universities, with limited resources, can only provide basic records management services. Records management programs mainly play an advisory role in a highly decentralised university environment. None of the participating institutions’ records management programs has a mandate to carry out records management audits in order to ensure compliance with approved records retention schedules and other internal records management policies. Adoption of records retention schedules is often voluntary. However, the unique schedule approval process in Québec makes compliance with records retention schedules mandatory in that province.

Although universities often operate in a decentralized fashion, records managers are making an effort to provide a degree of central control by developing common records retention schedules and campus-wide records management policies and procedures, and by providing centralised or semi-centralised records management solutions for both physical and electronic records. However, implementation of EDRMS solutions is still in its infancy. Many universities simply do not have the human and financial resources to run an effective, university-wide electronic records management program.

The study also shows there is an increasing need to build and enhance connections among records professionals within the academic records management community. Today’s academic records managers are facing the same challenges and working with limited resources. The continued success of university records management programs requires records professionals to actively explore solutions that bring local records management expertise together for knowledge sharing and collaboration.

 

Works Cited

“About the Records Management Program & Services.” University of Toronto Archives and Records Management Services, University of Toronto. utarms.library.utoronto.ca/about-utarms/rmps. Accessed 15 Oct. 2016.
Archives Act. c. A-21.1. Québec. Bibliothèque et Archives nationales, Government of Québec, 1 Nov. 2016, legisquebec.gouv.qc.ca/en/pdf/cs/A-21.1.pdf. Accessed 16 Aug. 2016.
Brown, Helen, et al. “Records Management in the University of British Columbia and the University of Pennsylvania.” UBC Social Ecological Economic Development Studies (SEEDS) Student Reports, 25 Nov. 2010, pp. 1-20. open.library.ubc.ca/cIRcle/collections/undergraduateresearch/18861/items/1.0108622. Accessed 15 Oct. 2016.
“FIPPA and MFIPPA: Bill 8 – The Recordkeeping Amendments.” Guidance Documents, Information and Privacy Commissioner of Ontario, 22 Dec. 2015, www.ipc.on.ca/wp-content/uploads/Resources/Bill8-New-Recordkeeping-Amendments.pdf. Accessed 15 Oct. 2016.
Fox, Uta. “The History of Records Management in Canada, 1867-1967.” Sagesse, Journal of Canadian Records and Information Management, Spring 2016, pp. 1-11. www.armacanada.org/index.php?option=com_docman&view=download&alias=43-the-history-of-records-management-in-canada&category_slug=canadian-rim&Itemid=458. Accessed 14 Aug. 2016.
Freedom of Information and Protection of Privacy Act. R.S.O. 1990, c. F.31. Ontario. Minister of Government and Consumer Services. Government of Ontario. 2016. www.ontario.ca/laws/statute/90f31. Accessed 8 Aug. 2016.
“Improving Efficiency at Ontario Universities: A Report by the Council of Ontario Universities.” Council of Ontario Universities, Dec. 2015, cou.on.ca/wp-content/uploads/2015/12/COU-Improving-Efficiency-at-Ontario-Universities-Dec2015.pdf. Accessed 15 Oct. 2016.
Külcü, Özgür. “Records Management Practices in Universities: A Comparative Study of Examples in Canada and Turkey.” The Canadian Journal of Information and Library Science, vol. 22, no. 1/2, 2009, pp. 85-107. Accessed 22 July 2016.
Kunde, Nancy M. “Reframing Records Management in Colleges and Universities.” College and University Archives: Readings in Theory and Practice, edited by Christopher J. Prom and Ellen D. Swain, Soc. of American Archivists, 2008, pp. 185-208.
Langemo, Mark. Winning Strategies for Successful Records Management Programs. Information Requirements Clearinghouse, 2002.
“Library Parc.” University of British Columbia Library, about.library.ubc.ca/changes/libraryparc/. Accessed 15 Oct. 2016.
Maher, William J. The Management of College and University Archives. Soc. of American Archivists, 1992.
“Organizational Chart: Records Management and Archives.” Concordia University, 5 May 2015, www.concordia.ca/content/dam/concordia/offices/vpdersg/docs/RMAD-org-chart.pdf. Accessed 15 Oct. 2016.
Peacock, Tom. “eDocs project promises improved electronic documents management.” News Stories, Concordia University, 19 June 2013, http://www.concordia.ca/cunews/main/stories/2013/06/19/edocs-project-promises-improved-electronic-documents-management.html. Accessed 15 Oct. 2016.
Purcell, Aaron D. Academic Archives: Managing the Next Generation of College and University Archives, Records, and Special Collections. Neal-Schuman, 2012.
Saffady, William. “A University Archives and Records Management Program: Some Operational Guides.” College and Research Libraries, vol. 35, no. 3, 1974, pp. 204-210.
Schina, Bessie and Garron Wells. “University Archives and Records Programs in the United States and Canada.” Archival Issues, vol. 27, no.1, 2002, pp. 35-51.
Skemer, Don C. and Geoffrey P. Williams. “Managing the Records of Higher Education: The State of Records Management in American Colleges and Universities.” American Archivist, vol. 53, no. 4, 1990, pp. 532-547.
“Welcome to UTL at Downsview.” University of Toronto Libraries, onesearch.library.utoronto.ca/downsview. Accessed 15 Oct. 2016.
Zach, Lisl and Marcia Frank Peri. “Practices for College and University Electronic Records Management (ERM) Programs: Then and Now.” American Archivist, vol. 73, no.1, 2010, pp. 105-128.

 

From Promise to Fulfillment: The 30-Year Journey of Electronic Recordkeeping Technology

By Bruce Miller

 


This is the inside story of the development of electronic recordkeeping software technology from its inception to its current capability and status today. I don’t want this to be just a history lesson, but an objective and factual story of what worked, what didn’t work, and where it has led us. I’ll also try my best to predict where we need to go in the future, based on where we are today. This story contains plenty of my opinion, and please remember, it’s just my opinion! I hope you enjoy the story….
– Bruce Miller

 

I see three distinctive phases, or stages, of electronic recordkeeping technology development to date. They are:

  • e-Records Software: 1983 – 2002. The birth of electronic recordkeeping software. The early, uncomfortable pioneering days, and the introduction of a US software standard that forever changed the marketplace.
  • Document Management: 2002 – 2012. This is the period when the large, leading document management software firms made their move to incorporate recordkeeping capabilities, and changed not just the technology, but how such solutions were delivered to the marketplace.
  • Microsoft: 2012-present. As a single company, Microsoft’s pervasive presence on our business systems across the globe means its influence is, to put it mildly, outsized. Microsoft’s SharePoint ECM platform began to gain significant ground on the traditional ECM providers. This stage marks the period when Microsoft finally realized that, like the established ECM players, they too needed to deliver recordkeeping capability. While their competitors already had well-established recordkeeping capabilities, Microsoft moved to make recordkeeping a fundamental cornerstone of SharePoint. For better or for worse, their move led to massive changes in recordkeeping software technology, and ultimately to what I believe has been a ground-breaking innovation that broke through a key barrier that held us back in the earlier stages.

 

The Three Stages of the Market

e-Records Software

This stage begins in 1983, when a few of Canada’s federal government agencies, led by a group known as the Canadian Workplace Automation Research Centre (CWARC), awarded a handful of fairly significant grants to up-and-coming technology firms to innovate new technology for “office automation”. Each grant was sponsored by a different federal department. The grants formed part of a contest of sorts: each grant would fund a proposal for a research and development project. Each resultant proposal would be judged and a winner selected. The winner would receive an additional grant from the government to hopefully spur the commercialization of the proposed technology. I was one year out of school, and working for a Montreal-based firm building the world’s first portable computer, Bytec’s Hyperion, in Ottawa.

This firm was a recipient of a grant sponsored by the National Archives of Canada (NAC) and the Department of Communications (DOC). DOC wanted to stimulate technology industry growth. NAC wanted new technology capability to manage electronic records. It fell to me to put together a proposal for a new technology, so I formed a team of young software specialists from my company, and key stakeholders from NAC and DOC. NAC would take the lead on defining requirements. DOC would be an example of a federal government consumer of the new technology, and help with requirements definition. We called it Foremost (Formal Records Management for Office Systems Technology). Our proposal won the contest, but by the time any formal project could be formed, the company I was working for collapsed around us. The future was bleak, but we had this great idea for new software.

My team and I decided to form a new company of our own to build the technology we proposed. We each threw in what small amount of cash we could muster, and we had ourselves a company. We secured venture funding to sustain our operations while we set about building this new software. NAC and DOC held out hope we could somehow, someday, eventually deliver a commercial product they needed, so they continued giving us guidance and direction on features we needed to offer. The software gradually grew to the point where we could actually deliver an early version of it. Meanwhile, a local Ottawa-based software consulting firm joined the race by developing a competitive product. By now we were also hearing of a second competitor out of Australia.

Undaunted, we continued to grow our company, and NAC tested and validated the software. We were beginning to receive much attention throughout the USA, along with increasing interest around the world, mostly from government agencies who were realizing that more and more of their records were in electronic form, yet they had no way to apply their obligatory recordkeeping controls to these records. NAC Director John Macdonald (now retired) was a key internal champion of the project, and he undoubtedly influenced the archives departments of other countries with his efforts to push for this type of technology. It has been said that through Macdonald, NAC’s work became a key factor in the International Council on Archives’ (ICA) eventual standard for electronic recordkeeping.

We were a typical small software start-up in so many ways. Everything we did was “the first”; nothing like this had ever been built before. We had competitors; however, we knew we were way out there on the innovation curve. We steadily developed the product with new releases every year. We were starting to get a few sales from early adopters, but no sales of sufficient size to make the company self-sustaining, let alone a financial success. We were getting a great deal of interest and promise, including from the US military and large corporations, but many potential buyers seemed reluctant to buy from a small Canadian software start-up.

Before I knew it, nearly a decade had passed. Our competitors were growing stronger and adding features, as were we. Our advantage however was venture funding – we had sufficient capital to grow a strong development team focused on building innovative new features, without having to wait for sales revenue.

Then the US Gulf War happened. Returning war vets complained of a never-before-seen illness that came to be known as “gulf war syndrome”. What does gulf war syndrome have to do with electronic recordkeeping? A lot, as it turned out. Gulf war vets launched a class action lawsuit against the US military for compensation. I don’t recall the precise outcome of the litigation; however, I know it cost the US military a great deal of taxpayers’ money. The Pentagon subsequently concluded that the failure to recognize the disease was caused in large part by the destruction of soldiers’ electronic medical records on battlefield computers. In an effort to prevent the loss of valuable electronic records in the future, they mandated that all US military agencies must now manage their electronic records properly. To hasten the transition to electronic recordkeeping, they established a mandated software standard, US DoD 5015.02-STD, which specified that all records management software systems used to store electronic records must meet the requirements laid out in the standard.

The Pentagon went further. They boldly decided that they would form a testing agency to certify vendor software products, to ensure the software met the stringent new standard. This new testing body was to be housed in Fort Huachuca, AZ, under the auspices of the Joint Interoperability Test Command (JITC).

Over time, this standard turned out to be a complete market game changer. Now the US military, arguably the largest consumer of recordkeeping technology in the world, could only buy technology that met this new standard. The US National Archives and Records Administration (NARA) embraced the standard and recommended that all US government agencies comply with it. This set off a chain reaction of “me-too” behaviour. Before long, plenty of US state and local government agencies adopted the standard. Even many corporations jumped on the bandwagon.

Their Requests for Proposals (RFPs) to purchase recordkeeping software listed 5015.02 compliance as mandatory, a practice that would eventually spread to Canada and Europe. I estimated at the time that nearly half of all RFPs for recordkeeping software listed 5015.02 compliance as mandatory.

There was just one problem.

There was no software to test. No software had ever been written to this new standard. My team and I provided substantial input to the team developing the standard; however, JITC was very careful to remain neutral toward any single vendor. I remember many long debates with the team about what was possible versus what was practical. The resultant standard was a very daunting list of capabilities never seen before. I felt our new software met the bulk of the core requirements, but a great deal of work would be needed to meet the standard. The race was on to get certified against the new standard. And we were not the only company; everyone in the recordkeeping software business now had to get certified, or risk losing key market share.

I knew this was an all-or-nothing gamble for our company – we had to do it. So we doubled down on venture funding to build up our team with more skills and resources, and we scheduled the first certification test of our software in Fort Huachuca.

There was just one problem.

How was JITC to test software they had never seen? Test procedures had to be written. To write the test procedures, JITC had to know what they were testing. But they had never seen any software like this before. They were new to recordkeeping, let alone to this new breed of software. To compress a long story into a few short milestones, I found myself routinely visiting the Fort Huachuca labs, on the Mexican border, over a period of several months. I had become accustomed to the folks at the lab, and they were familiar with me. Being an active military base with a drone airport, the fort was a typical bustling US military base and I fit in with the many other contractors and civilians working on day passes.

Once the test procedures were finalized, the big day came for the test of our software, the first ever for the JITC team. I showed up and joined the team at the lab, as usual. Security, however, was unusually rigorous. The US had launched the Second Gulf War since I was last at the lab. Unbeknownst to me, each base raises its security level when the country goes to war. I was stopped in the hallway by military police and asked to show my ID. My Canadian passport was all I had ever needed to show in the past, but not any longer. On day one of the test, I found myself literally out in the Arizona desert with my Canadian passport in hand. My laptop, with all of its test data and test procedures, had been confiscated and was to be erased forever. I will never be able to tell the story of how we recovered from that, but suffice it to say it involves really fine folks from the US military, hot desert, and a lot of Kentucky Fried Chicken and Coca-Cola over many days.

And so the world now had its first 5015.02-certified software. Other software firms were lining up to schedule their certification tests, and JITC was booked up with tests for the next two years. The world of electronic recordkeeping software would never be the same. The primary focus of the software became 5015.02 certification. The standard obligated all the vendors to build these features into their software, or risk losing market share. It ultimately shaped the overall architecture and feature set of all compliant software. Our local Ottawa competitor went on to get certified, as did our Australian competitor.

Once the US standard had been well established, Australia followed up with their own standard called the Victorian Electronic Records Strategy (VERS). It would be several years before the Australians developed a software certification testing regimen to accompany the standard. Other standards began popping up:

  • ICA, Module 2 Principles and Functional Requirements for Records in Electronic Environments. This eventually became ISO 16175-2:2011.
  • MoReq (MoReq1, MoReq2). Originally intended as a European standard.

It appeared to me that each of these standards was attempting to somehow outdo or at least add onto the US standard. At the time, I felt the US DoD 5015.02 standard to be absolutely essential to sell into the US market.

MoReq appeared to me to be asking for too much of the software, such as the ability to “preserve the readability of the electronic document forever”. Good luck with that! I struggled to take it seriously, and I never encountered a single case of a buyer demanding compliance with it. I admired the VERS standard, as it indeed built on 5015.02, but once again it seemed to only matter to Australian buyers, whereas I, and I believe most of my competitors, were focused on the US market.

So what about Canada? Our federal government, via the Treasury Board of Canada, eventually selected ICA Module 2 as the standard for Government of Canada software. This standard is very different from 5015.02 in that it's written from the perspective of an archivist, primarily concerned with capabilities that support the preservation of records. The ICA standard simply is not as focused on the active lifecycle of the records – much of what it calls for is downstream of the active lifecycle. Additionally, most of the standard's input came from Commonwealth countries, and it really shows: the old-fashioned block-numeric classification system and the security levels (Confidential, Secret, Top Secret, Protected, etc.) are reflected in it. ICA Module 2 calls for a breathtaking 275 capabilities versus 168 in 5015.02, some of which, like MoReq1's, were simply aspirational and impossible to deliver with today's technologies.

I’m not advocating one standard above another – all standards serve a useful purpose. I worry however when I see the word “should” in a software standard. You cannot test a software feature against a “should” requirement. To me, a software standard is only a software standard when real-world technology can be tested against it. All other standards are aspirational, and no software vendor should ever claim compliance with an aspirational standard, but some do.

How were the e-records software firms doing? The Australian firm was doing the best of the three I've been mentioning: they appeared to be selling more software (albeit mostly in the Pacific Rim region), and the company was growing larger and faster than the others. They were concentrating on the local Australian and European markets, and doing well; the company grew to around 35 people. They were somewhat late in obtaining US DoD 5015.02 certification, but eventually they set up a small beachhead in the USA by hiring a well-respected former US Air Force pilot, later turned military records program manager, to represent them there. They were primarily delivering solutions aimed at physical records (paper folders, boxes); the management of electronic records always appeared to me to be a secondary effort for them. Similarly, our local Ottawa competitor largely sold paper-based recordkeeping solutions, with electronic recordkeeping as a secondary effort. They soon obtained 5015.02 certification, and won acceptance from the Canadian government in a "lowest-wins" bidding competition for the system that would eventually be known as RDIMS (Records and Document Information Management System)1. While we were disappointed in this, we were convinced the US market was where our future would be written, not in Canada. Unlike all of our competitors at the time, our solution was purely for electronic records, with physical records as a secondary feature set – the exact opposite of our competitors.

Why did we not focus on paper records? Because our mission was to deliver solutions for electronic records, not paper records. The world was awash in competent paper records handling solutions. We stayed laser-focused on that mission!

The first 5015.02 certification galvanized the entire electronic recordkeeping software industry. Potential buyers were calling for compliance with the standard. If you didn’t meet the standard, you could not sell software to much of the US market. And for a while, only one small Canadian software firm met this standard. The big document management software companies were suddenly paying attention….

 

Document Management

While e-records software was unfolding from 1983 to 2002, a parallel market was developing at the same time: Document Management. These were products that provided an organized repository for all the electronic documents being created, and a powerful search capability so users could find the documents they needed. It made no sense for all users in a business to store their documents only on individual computers, and the demand for Document Management was growing rapidly. This was no small business like electronic recordkeeping – this was truly big. By 2002 this type of software was becoming pervasive in organizations around the world. It seemed then, as it does today, that pretty much every organization larger than a few hundred users needs some form of document management.

Document Management had grown in market install base as well as capability, and had by 2002 morphed into "Content Management" – Enterprise Content Management (ECM), to be exact. IBM had its Content Manager platform. OpenText started out with its LiveLink ECM offering and, through several acquisitions, including Canadian firm Hummingbird, grew its Content Server ECM platform. FileNet had its well-known P8 ECM product, heavily focused on image management. But the granddaddy of them all was Documentum out of California. Documentum completely dominated the pharmaceutical industry with large-scale ECM solutions for big-pharma companies around the world. If it was huge, it was usually Documentum. IBM was no slouch either – they had installations all around the globe, including one well-known installation (US Social Security Administration) of over 200,000 users. Hewlett-Packard (HP) later entered the ECM market with their acquisition of Autonomy. Remember that all of these "new" ECM platforms were merely the continuing evolution of document management – just with a fancier name and ever-increasing capabilities. Records management software, meanwhile, was a tiny, specialized industry being developed by a small number of relatively tiny software vendors. These two markets had yet to intersect in any meaningful way.

ECM was ECM. Records management was records management. And that was that….

Market consolidation has been relentless in this ECM market segment. Documentum was acquired by EMC, and EMC has since been acquired by OpenText. IBM has acquired FileNet.

These companies paid little attention to the electronic recordkeeping market – they had no reason to. However, with the first certifications of 5015.02, they were seeing “compliance with US DoD 5015.02” appearing in the “Mandatory” column of the RFPs they were responding to. Many of their US government buyers and prospects were telling them “We can no longer purchase your products unless they are 5015.02 compliant”. In Canada, and doubtless in other countries, compliance with 5015.02 was appearing on a good number of RFPs for ECM technology.

There was just one problem.

What the heck was this 5015.02 thing? ECM vendors were completely caught off guard – they did not see this coming. They had two choices: build, or buy. They could design and build these strange new features into their products, and ultimately achieve certification. Or, they could buy existing technology and incorporate it into their own products. Remember that by this time, we had been designing, building, and perfecting this technology for over a decade, with 100% exclusive laser focus on this market niche. It took an enormous amount of specialized knowledge, a large team of highly skilled developers, and a great deal of time, effort, and money to achieve 5015.02 certification. This work could not possibly be done quickly or easily, even by mighty IBM.

ECM vendors IBM, FileNet, Documentum, and OpenText all needed this certification – and the sooner the better, as their lack of certification was having a negative impact on their sales quotas. Four companies needed certification in a hurry, and there were only three potential acquisition targets with the technology: my company in Ottawa, our local Ottawa competitor, and our friends in Australia.

Here's what happened. I left my own company and formed a new company to build a second generation of the technology we felt was essential to success. My previous company was acquired by Documentum. My new company was acquired by IBM. My local Ottawa competitor was acquired by OpenText. The Australian company decided to hang tight and make a go of it on their own. FileNet, left at the altar with no bride, had no choice but to build recordkeeping features itself, which it immediately set out to do. Several years later, the Australian company was finally acquired by Hewlett-Packard.

With the notable exception of the Australian firm I've been referring to, the nascent e-records software business essentially disappeared almost overnight, swallowed up in the rush by the ECM vendors to become DoD 5015.02 compliant before their competitors. The few small specialized vendors were gone – none were left except the Australian firm, and the ECM vendors were now calling the shots. To this day I believe this is the right thing for the market. I came to learn that organizations with recordkeeping requirements have no choice but to buy software that meets their needs (usually expressed as 5015.02 compliance), but that doesn't necessarily mean they end up using the features they buy. Recordkeeping was now just a "feature" of modern ECM. If you wanted electronic recordkeeping, you had to buy ECM first. Recordkeeping became a mandatory check-box in most RFPs for the acquisition of ECM. ECM sales kept climbing, and everyone was happy.

I cannot speak for the other acquisitions, but IBM did a fantastic job of absorbing my company. It meant a lot to IBM and it was extremely well executed. But I was anything but happy. After a few years, I noticed that despite all the sales and delivery of electronic recordkeeping solutions, few buying organizations were actually deploying the solutions. This was not just within IBM – but right across the market. ECM was being deployed with recordkeeping capability, but I could find no evidence that the new ECM recordkeeping capabilities were being used to truly manage electronic records.

I encountered many organizations who claimed to be managing their electronic records. When I looked into their projects, several disturbing things emerged time and time again. Many (too many) organizations had no idea how to go about deploying electronic recordkeeping. Some had included recordkeeping on their ECM orders because they had to, but never deployed it (shelfware). Many tried and failed to deal with electronic records, and reverted to managing paper records only, leaving electronic behind as unmanaged documents. I met one large US firm, a household name, who tried three times and spent millions trying to deploy electronic recordkeeping with two different ECM vendors, and eventually gave up in frustration.

To me, a successful deployment of electronic recordkeeping is defined by two simple criteria. First, electronic documents are being properly managed as records on a regular, daily basis by all users. And secondly, the corporate records manager is carrying out disposition of electronic records, in accordance with the retention schedule. On countless occasions, I heard stories of successful electronic records management. But once I peeled back the project’s layers to seek out my two success criteria, I’d never find them. Ever.

The ECM software obviously met recordkeeping requirements, at least as laid out in 5015.02. Many ECM vendors went well beyond 5015.02 and delivered features such as content searching, e-discovery, and physical records handling. Most even delivered some form of email management, a critical component of electronic recordkeeping, as so many electronic records are emails. The ECM platforms were delivering strong, powerful features. Successful ECM solutions were being delivered from large, global companies with highly competent people.

So why wasn't the electronic recordkeeping portion of these ECM solutions successful, even when the underlying ECM project was completely successful? As it turns out, there were a number of clear, obvious causes. Over the years, across many different projects, ECM technologies, and types of organizations, I began to notice the root causes (project "factors") that were preventing electronic recordkeeping from taking root and succeeding. Sometimes a single factor alone was fatal to the recordkeeping; sometimes a combination of factors was present. Either way, the outcome was not good – electronic recordkeeping was not taking place.

Following are the four project factors I had been seeing, and I am still seeing them to this day:

 

Low Recordkeeping Priority. A move to enterprise-wide ECM is a large, costly, and risky adventure for any organization. It represents a major shift in the organization’s Information Technology (IT) landscape and is extremely disruptive to users. A great deal of money and reputation is invested in ECM projects. It’s important that they are successful, and the tolerance for failure is low. In some cases, IT people feel their reputations, and even their positions, are at risk in the event of a failure. Not so with the recordkeeping aspect of the project. Recordkeeping is just “a minor part of the project”. Nobody will wind up an ECM project if the recordkeeping element of it is not satisfactory. Nobody’s reputation or job is on the line. If the recordkeeping part of a corporate ECM deployment fails – then ECM life goes on, without recordkeeping.

Relative to the overall ECM project, the importance of recordkeeping is small. In many organizations, the ECM is “too big to fail”, which renders the recordkeeping side of the project essentially disposable. In my experience, and this continues to this day, far too many ECM deployments have failed with electronic recordkeeping, and the corporate RIM professional is left managing paper records with the sophisticated electronic recordkeeping features sitting idle.

ECM vendors know that as long as they meet requirements, they can sell their products. If the recordkeeping does not pan out, they're not likely to see much heat from a buyer who, in too many cases, doesn't understand the technology well enough to know how to deal with it.

In my view, this is an unintended consequence of recordkeeping being a feature of ECM. At the outset, everyone agrees that recordkeeping is a “must have” feature. So ECM proceeds. Later, recordkeeping fails. Are we going to throw out the entire ECM project and start over? Of course not. ECM is safe – it is always safe, and recordkeeping is not.

 

Ill-Equipped RIM Professionals. Some RIM professionals misinterpret my position on this and feel I'm saying RIM professionals are not able to handle this technology. Nothing could be further from the truth. Today's RIM professionals are more tech-savvy, better educated, and better equipped than ever to deal with technology. But despite this, many of us are still sitting ducks in front of advancing ECM technology. The corporate records manager has to fully understand the incoming electronic recordkeeping capabilities of the new ECM software. That's a lot to learn and master, usually in a short period of time. They must become 100% comfortable with the ECM technology itself by asking these questions: How does it work? How is metadata defined? How can I control the metadata values? How do I manage documents as records? What do I have to ask of the users? They also have to get their retention schedule into the ECM, which is usually like fitting a square peg into a round hole. Most retention schedules are not structured suitably for modern ECM systems; they need to be massaged at the very least, or completely restructured in some cases.

The RIM Professional has to master the ECM technology, then master the electronic recordkeeping capabilities, get the retention schedule re-shaped and loaded, then heavily influence the manner in which the ECM is configured and deployed in order for the recordkeeping to work properly. Usually, all of this has to happen on top of a 40-hour work week filled with everyday recordkeeping responsibilities. To put it mildly, this is a pretty big challenge for the best of us.

I also find that the effort and investment the ECM vendors put into the recordkeeping component is small relative to the effort put into the platform as a whole. I know of one ECM vendor who came perilously close to dropping recordkeeping altogether because “it wasn’t important enough”. In my opinion, the documentation, training, professional services, and aftersales support for the recordkeeping features often lag well behind the rest of the ECM platform. I have no doubt that this is merely a reflection of the priority of the recordkeeping features relative to the platform itself, but nonetheless there is a great deal of room for improvement in this area across all of the vendors. At the end of the day, it’s another element that makes it even more of an uphill climb for the RIM professional to come to grips with the technology.

It’s not the fault of the individual RIM professional – it’s the fault of the industry at large, including the vendors and the buyers. There is a massive educational gap that is getting worse, not better.

 

ECM Deployment Failures. Not all ECM deployments go well. I've not seen any two estimates of ECM deployment success that even define success the same way, let alone show any consistency of results or measures, so it's impossible to put a number to successful ECM deployments. But whatever that number is, it certainly is not close to 100%. Simply put, the ECM is the "vessel" electronic recordkeeping is riding in. If that vessel is unhealthy and is rejected by end users, there's nothing recordkeeping can do to overcome it. The electronic recordkeeping capability of any ECM solution needs the ECM to be widely adopted by all end users, capturing not only the electronic records themselves (including email) but also the metadata that recordkeeping depends on. If this is not happening, it's pretty much impossible to manage the documents as records: applying proper control to them, classifying them against the retention schedule, and carrying out disposition at the end of their lifecycle.

 

The RM-IT Gap. Because recordkeeping is a feature of the larger ecosystem of the ECM, it cannot operate independently of the ECM. The records themselves are documents stored within, and fully under the control of, the ECM. The ECM must be configured to determine the metadata applied to the documents. The ECM determines and applies security permissions. And it specifies where documents are stored, by defining locations such as folders, libraries, or whatever nomenclature the particular ECM applies to these locations.

This means that in order for ECM recordkeeping features to work, the RIM professional has to work very closely with the IT team to configure the ECM itself. Usually, however, IT is simply not comfortable with the idea of RIM people telling them how to configure ECM. Worse still, few RIM professionals are comfortable configuring sophisticated ECM systems – they have to learn a whole new technology and skill set from scratch. The few RIM professionals who tackled this with gusto were often rebuffed by large IT departments who were moving ahead with ECM configuration according to their own blueprint, regardless of RIM. In too many cases, IT proceeded with ECM planning, configuration, and deployment according to their own vision and approach, with little regard for the recordkeeping features. The RIM professional often lacked the mandate, the political clout, or the know-how to influence the ECM deployment to the extent necessary for a successful electronic recordkeeping deployment. I found that the larger the organization, the larger the gap I was likely to see. IT would blast ahead with ECM deployment, and the RIM professional would manage paper records, trying in vain to influence the ECM direction.

 

By 2012, many ECM vendors were starting to feel somewhat uncomfortable with the recordkeeping side of their projects. They perceived recordkeeping as complicated – a lot of trouble. Surely something simpler would do the trick?

By 2012 most ECM products were shipping with the ability to apply “Policies” to documents. A policy was a set of behaviours and characteristics that could be automatically applied to a document. A policy could be applied to a location (e.g. folder) such that all documents in that location automatically inherited the policy. Such policies specified how long the document was to be retained, when it would be destroyed, and any criteria that had to be met before the document was deleted. Sometimes these policies respected the retention schedule, but they did not always have to – it was possible to create all the policies you wanted, without any regard for the retention schedule.
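To make the mechanism concrete, here is a minimal sketch, in Python, of how such a location-based policy might be modelled. Every name, and the seven-year retention period, is purely illustrative – this is not any vendor's actual API:

    # Illustrative model of a location-based retention policy (hypothetical names).
    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class RetentionPolicy:
        retain_for_days: int             # how long the document must be kept
        requires_approval: bool = False  # criterion to meet before deletion

    @dataclass
    class Document:
        folder: str                      # location, e.g. "/Finance/Invoices"
        created: date
        approved_for_deletion: bool = False

    # Policies attach to locations; every document in a folder inherits its policy.
    policies = {"/Finance/Invoices": RetentionPolicy(retain_for_days=7 * 365)}

    def is_deletable(doc, today):
        policy = policies.get(doc.folder)
        if policy is None:
            return False                 # no inherited policy, no automatic deletion
        expired = today >= doc.created + timedelta(days=policy.retain_for_days)
        approved = doc.approved_for_deletion or not policy.requires_approval
        return expired and approved

Note what is missing from this picture: nothing ties the policy to a retention schedule, which is exactly the problem described next.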

This new capability became quite popular with most ECM products, because it was seen as a way for customers to delete documents without the trouble and overhead of records management, which more often than not was going nowhere fast. Many organizations proceeded to use these simple policies to delete documents based on IT-imposed criteria, or criteria that originated from the business units with no regard for the retention schedule. It was better than nothing, they reasoned. For me, however, this was a disturbing trend – more proof that the industry at large was not making any headway in deploying electronic recordkeeping.

To this day, many ECM vendors offer formal recordkeeping capability, but they also offer this generic "policy capability". On more than one occasion I heard policies referred to by one vendor as "records light", which made me cringe. Having spent my entire career building electronic recordkeeping technology, I was deeply disturbed by the realization that, with few (if any) exceptions, there were no genuinely successful deployments of electronic recordkeeping that met my simple criteria. So I left IBM and formed a vendor-neutral consultancy with a single mission – to help buyers deploy their electronic recordkeeping projects successfully.

While all this was going on, Microsoft had entered the ECM fray with their SharePoint offering in 2001, which few established ECM vendors took seriously. Anyone in the software business, however, knows not to take Microsoft lightly (ask Michael Cowpland). By 2012 Microsoft had slowly but steadily eaten into the market share of all ECM vendors, from the bottom up. The established vendors like EMC, IBM, and OpenText had to pay attention now, as SharePoint was growing up and becoming a genuine market challenger. And like all their ECM competitors, Microsoft soon ran into this thing called "records management".

 

Microsoft

Microsoft SharePoint’s penetration of the market has been a slow, inexorable march to a point today where it is a force to be reckoned with. Some figures I found widely disseminated on the web:

  • SharePoint is a $2B business within Microsoft.
  • Microsoft claims 20,000 new SharePoint users are added every day.
  • 80% of Fortune 500 companies use SharePoint.
  • Microsoft claims 66% penetration of the enterprise market.

The numbers above are years out of date (I could not find up-to-date numbers), but suffice it to say SharePoint has a major presence in the modern enterprise. That means more and more SharePoint sites are going to – you guessed it – be storing documents that need to be managed as records.

I ended the previous stage at 2012, for a good reason that I'll get to shortly. First, though, I need to take you back to 2006, when Microsoft's adventure with recordkeeping began.

In 2006, Microsoft was starting to deliver SharePoint deployments to increasingly large US corporations and government agencies. It was a viable ECM contender by then, taking market share away from the bottom end of the share held by IBM, FileNet, OpenText, and the others. Some of their most cherished customer accounts delivered Microsoft an ultimatum. Build in recordkeeping compliance, or we have to drop SharePoint for a competitor.

So to great fanfare, Microsoft announced recordkeeping capability in their new SharePoint 2007 release. Suddenly the web was flooded with countless articles from Microsoft, and from SharePoint experts of all sorts, telling the world how to manage records with SharePoint 2007. You could now create policies that would delete documents automatically, in accordance with a retention schedule. I undertook to dissect this new capability to evaluate it for myself, and for my clients. I was most distressed by what I found. I read everything I could find on the web. I flew to Microsoft's head office in Redmond, Washington, and spoke to them about these features. I met most of the five-person SharePoint records management team (one was from Montreal – a huge Habs fan). Great people, but I can tell you they had absolutely no background in records management. I even had the rare privilege to interview Microsoft's chief SharePoint architect.

My conclusion – they blew it. There was no means of inputting or managing a proper retention schedule. There was no way to properly manage case records. And there was no disposition process. SharePoint simply went ahead and blew away the documents as soon as a time deadline was reached.

Microsoft launched a full-court press on records management. The massive constellation of SharePoint partners and experts re-purposed and re-published Microsoft's core messages about recordkeeping. To this day, I see thousands of web pages that tell us in excruciating detail how to manage records with SharePoint. I told Microsoft how they got it wrong. The response? A polite explanation of how I was obviously wrong. They sent their corporate records manager out on the records management speaking circuit to tell the story of how they manage their own records in-house with SharePoint, and how the rest of the world could do the same.

Perhaps it was just me, but I was feeling very much like a pariah. I was a lone voice in the wilderness directly contradicting mighty Microsoft. As I found out, Microsoft has an awful lot of friends – their worldwide network of partners and dealers, all of whom were making money selling SharePoint services. And of course I was telling organizations that SharePoint did not manage records, while the firms were telling the same organizations that they could readily deliver electronic recordkeeping solutions with SharePoint. Microsoft backed them up every time. Who backed me up? Nobody of course. RIM professionals around the world were deeply skeptical of Microsoft’s claims, but it seems nobody could articulate exactly what it was that was wrong with SharePoint.

In 2009, Microsoft claimed they had heard the problems associated with SharePoint, and they announced, again with great fanfare, a slew of improvements to the recordkeeping feature set. Records in-Place! New Content Organizer Capabilities! New Records Center! Multi-stage retention policies!

The announcements hit most of the right points. By now some degree of skepticism had crept into the market about Microsoft's ability to manage records. But again I went back to Microsoft to find out for myself what had changed. My conclusion – not enough. Nothing that would overcome the three original fatal shortcomings. By now I was getting somewhat tired of answering the same question over and over again – what exactly is the problem with SharePoint? Few people were taking me seriously. Microsoft was still selling recordkeeping in SharePoint, and let's just say I was not a fan favourite of the SharePoint partner world.

I'd had enough – something had to be done. My own reputation and credibility were starting to take a hit. So I wrote a detailed report that put the hard facts in writing. I knew I'd be challenged on every detail, so I researched every detail carefully and validated everything with Microsoft. The report carefully stated the shortcomings, and detailed how to customize SharePoint to get it right. I asked Microsoft to review and verify the entire report for accuracy, which they did. They granted me permission to use a statement claiming that Microsoft had reviewed the report for factual accuracy. I offered the report to anyone who wanted it. ARMA International (https://members.arma.org/eweb/home.aspx?site=ARMASTORE) published it as a book. It soon circulated around the world. What happened? Nothing. Which was the best possible outcome, in my view. It was never challenged. Over the years, and not just because of my book, more people came to realize that perhaps Microsoft really did get it wrong. Microsoft even warmed up to me somewhat – they invited me to join their Canada-wide roadshow promoting SharePoint. I accepted their offer to explain how to manage records with SharePoint. That lasted exactly one session. Microsoft very nicely dumped me for being "too negative" on the recordkeeping. Back in the doghouse I went….

It's time now to bring you back to 2012, where this third stage begins. Some enlightened software vendors in the records management space saw a market opportunity emerging as Microsoft floundered with records management. A whole new market segment was born – recordkeeping add-in software for Microsoft SharePoint. As of the date of this report, there are four vendors delivering these solutions – two in the US, one in Australia, and one in Canada. One of the four brings 5015.02 certification to the Microsoft platform, and two of the other vendors have committed to the certification.

Now a SharePoint buyer could finally get genuine recordkeeping. Each of the four add-in vendors took a radically different approach to achieving recordkeeping with SharePoint, but they all got the job done, and done correctly, after a few false starts with a couple of the vendors. But what if Microsoft someday gets it right, and releases real recordkeeping as a native SharePoint feature? That would likely wipe out this nascent market of recordkeeping add-ins. I do not see that happening. Microsoft tells me that as long as there is a healthy market for recordkeeping add-ins, they have no business case for diving back into the records business. They prefer to focus on the platform, and to encourage an ecosystem of third-party products and services to deliver customer solutions. As long as people buy SharePoint, they're happy. So Microsoft is happy, and the RM add-in vendors are happy. Hopefully the SharePoint buyers will be happy too! From what I see, so far so good. A new RM add-in vendor joins the market roughly every two years, and I look forward to every new vendor.

But this market segment of just four relatively small vendors is doing something radically different from what I'm seeing in the rest of the ECM market (the non-Microsoft vendors). The pace of innovation is astonishing. These RM add-in vendors don't have to worry about any of the plumbing or architecture of an ECM – they put 100% of their effort into managing records, while leaving the tough ECM stuff to Microsoft. They don't have to worry about a repository, about creating metadata, or even about searching – that's all done for them. They're free to innovate new ways to apply recordkeeping to documents, without worrying about anything else. Within Microsoft's ECM competitors, relatively few resources are applied to the recordkeeping capabilities. But these RM add-in vendors have formed entire development, marketing, and support teams devoted 100% to recordkeeping.

And there's another radical difference. Most of the legacy ECM products are what I refer to as location-based, in that many of the behaviours and characteristics of documents are determined by the location in which the document is stored (which folder, library, etc.). That means that location matters, and users are constantly worrying about where something is stored. SharePoint turns that on its head with a location-independent approach, where location does not matter. One of the most pervasive end-user objections to recordkeeping I've encountered over the years is that users do not want records to dictate where they store their documents. Until now, most (not all) ECM solutions dictated where the user had to store documents in order for recordkeeping to happen correctly.

And this is where the magic has truly taken place. These four vendors have finally produced genuine recordkeeping automation. Rules-Based Recordkeeping (RBR) is a software capability that allows the RIM professional to fully automate the following (a rough sketch follows the list):

  • Determine which documents are records.
  • Decide when to declare documents as records.
  • Classify against the retention schedule (correctly!).
  • Decide when to move the record to a long-term repository.
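To illustrate the idea, here is a minimal sketch, in Python, of what such declarative rules might look like. The metadata fields, lifecycle events, and schedule codes are all hypothetical – actual RBR products each take their own approach:

    # Hypothetical sketch of Rules-Based Recordkeeping: declarative rules,
    # evaluated against document metadata, replace per-user decisions.
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class Rule:
        matches: Callable[[dict], bool]  # predicate over document metadata
        classification: str              # retention schedule code to apply
        declare_on: str                  # lifecycle event that triggers declaration

    rules = [
        Rule(matches=lambda d: d.get("content_type") == "Contract",
             classification="LEG-0450",  # hypothetical schedule code
             declare_on="final_approval"),
        Rule(matches=lambda d: d.get("library") == "Board Minutes",
             classification="ADM-0110",
             declare_on="check_in"),
    ]

    def evaluate(document: dict, event: str) -> Optional[str]:
        """Return the classification to apply if any rule declares this
        document a record on this lifecycle event; otherwise None."""
        for rule in rules:
            if rule.declare_on == event and rule.matches(document):
                return rule.classification
        return None

    # evaluate({"content_type": "Contract"}, "final_approval") -> "LEG-0450"

The point of the sketch is the design choice: the RIM professional writes the rules once, against metadata the platform already captures, and the end user is taken out of the loop entirely.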

This is the breakthrough I've been waiting 30 years to see. For 30 years, we've been depending on users to identify which documents are records, and to classify them against a retention schedule they do not care about or understand. This dependency on the end user has been holding the technology back since day one. It appears to me that those days are finally over. To be fair to the non-SharePoint vendors, some of the traditional ECM products have incorporated some degree of this automation, but I've not seen anything close to the level of automation I'm seeing in the SharePoint market segment.

If it sounds as if I'm promoting SharePoint solutions over other ECM products, well, again you'd be wrong. Traditional ECM vendors will always have a strong market position. Even Microsoft readily admits they cannot be all things to all people. I have clients with ECM requirements that could never be met by SharePoint, and I doubt they ever will be. One of my clients uses Documentum for millions of aircraft maintenance records and drawings – they are not going to switch to SharePoint in my lifetime! Many installations of mature non-Microsoft ECM platforms are now finding that SharePoint usage is "creeping into" their organizations. Some will inevitably switch to SharePoint. Some will ignore SharePoint and carry on happily with no issues. And some will try to connect the two so they "work together" – SharePoint for document creation, and the traditional ECM as a formal records repository. That is technically possible, but a bear to implement and manage.

All non-Microsoft ECM vendors have the means of delivering good recordkeeping within their solutions. It just takes more time and work to reach the goal. And there’s absolutely no reason why these ECM vendors cannot build the same powerful RBR automation capabilities into their products. I would certainly encourage them to do so.

 

Where Are We Now?

ECM systems that deliver electronic recordkeeping capability have become well known in the market these days as EDRMS (Electronic Document and Records Management System). I’ll use this terminology from this point forward.

There are now three distinctly different EDRMS technology “streams” in the market today. They are:

 

Traditional ECM           These are the large ECM vendors I’ve been referring to (IBM, OpenText, etc.). In order to get EDRMS capability you have to invest in a sophisticated ECM platform, then utilize the recordkeeping component. As of the date of this report, these vendors have not innovated to the same RBR level as those in the SharePoint stream, but that could change very quickly.

 

SharePoint                   Here you have to buy a third-party add-in to augment SharePoint. That's a downside – any IT professional will tell you that integrating separate products is never the best technology option. However, you get a higher level of focus and innovation from the RM add-in technology. And the big bonus is RBR – you can automate most recordkeeping functions such that you no longer need to depend on the end users to meet your recordkeeping performance numbers.

 

Independents               There are a few (I can identify only three, perhaps four) records-specialist software firms that cater primarily to RIM professionals and have extended their products to include ECM capabilities. These products are "records-forward", or "records-centric": they are all about records, and are aimed at the recordkeeping market. I will include HP, with its HP RM offering, in this stream. Although the technology configuration for HP RM is the same as the other two in this stream, the company is quite different: HP is a global powerhouse, whereas the other two are niche market players, and their size reflects this.

 

All three have their place in the market, and will continue to ship solutions that work. Very rarely (if ever) have I seen an organization select an ECM technology based on recordkeeping requirements. Most often, the IT department will select either SharePoint or a traditional ECM technology, and run with it. The recordkeeping options then fall out of that particular technology choice. If they select OpenText, for instance, they will have to use the recordkeeping options of OpenText. If they select SharePoint, they will have to select one of the four RM add-in products.

I do not like what I’ve seen to date with solutions delivered within the Independent stream. I’ve seen some serious difficulties achieving end user adoption, particularly in small to medium deployments (fewer than 1,000 users). These products will likely fit best in corporate cultures that accept a strongly records-forward operating stance. I believe it will take more work to achieve end user adoption in this stream than with the other streams.

Remember the four problems I mentioned earlier that have been plaguing EDRMS projects to date? They’re still there. So what has changed? The technology has changed, particularly within the SharePoint stream. With RBR, the odds that you can overcome the barriers to success just went up – way up. If you suffer a low priority of RM versus ECM, it’s not quite as difficult to overcome. It’s not as difficult for the RIM Professional to skill up and become equipped for success. The other two factors really don’t change. The ECM itself is still challenging to successfully deploy. And if the RIM professional and IT are not working closely together, recordkeeping is still doomed.

 

Email

By my reckoning, emails comprise anywhere from 30-80% of an organization’s total digital records, all of which need to be controlled and managed properly as records. Put another way, there are often 3-5 times more emails than documents that meet the criteria of a business record, and these emails should (must) be managed as records. No organization can claim they are managing their electronic records unless they are applying records control to their email.

Technologically, email remains the Achilles heel of any modern EDRMS project. The problem is that most organizations use Microsoft’s Outlook/Exchange platform for email. This email platform has no recordkeeping capability whatsoever. It’s an island of massive volumes of stored information that’s disconnected from the ECM in every way. Even Microsoft’s SharePoint ECM platform is not connected to email. In the ECM stream, each ECM vendor has to write special integrations between their products and Exchange in order to provide their users an easy way to get their emails into the ECM where they can be managed as records. In the case of SharePoint, yet another third party product is required to integrate email with SharePoint. For any reasonable EDRMS solution in the SharePoint stream therefore, three separate technology acquisitions are required – SharePoint, the RM add-in, and an Email Integration product.

Even when email is tightly integrated with the ECM platform such that users have an easy way to get their emails into the ECM, the choice of emails to put into the ECM remains a voluntary end user decision. To a user, this means they have to decide which emails are important to the organization. Then, they have to go through the process of actually submitting the email to the ECM, which means they have to fill in mandatory metadata that identifies what the email is about. This adds up to end user effort, which once again puts us back into the dark years of the past where we depend on end user discretion and effort. And we all know how that has served us to date!

There are some (I can identify one for sure, and perhaps a second) innovative new technology solutions that actually use software artificial intelligence, or some semblance thereof, to read the email in the inbox, and determine if it is a record or not. It ranks all email by likelihood that it is a record, and how it should be classified. Does this technology work? Sort of. In some circumstances, with certain types of email (predictable, well-described email), it works very well. In other cases, it’s positively awful. Overall, it has great promise, but it’s a long way from everyday usability. I note also that the cost, and the technology overhead necessary to support this whiz-bang capability, will take your breath away. Therefore I believe this capability is only suitable for large-scale, well-funded projects with extremely deep resources.
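The vendors' actual models are proprietary, but the ranking idea can be illustrated with a deliberately naive sketch in Python – real products use far more sophisticated classifiers than keyword weights:

    # Toy illustration of ranking emails by record-likelihood (hypothetical signals).
    RECORD_SIGNALS = {"contract": 0.4, "invoice": 0.4, "approval": 0.3, "minutes": 0.3}
    NOISE_SIGNALS = {"lunch": -0.5, "unsubscribe": -0.6, "out of office": -0.6}

    def record_likelihood(subject, body):
        """Score between 0 and 1: higher means more likely a business record."""
        text = (subject + " " + body).lower()
        score = 0.5  # neutral starting point
        for phrase, weight in {**RECORD_SIGNALS, **NOISE_SIGNALS}.items():
            if phrase in text:
                score += weight
        return max(0.0, min(1.0, score))

    # Rank an inbox so the likeliest records surface first.
    inbox = [("Re: lunch Friday?", "Pizza?"), ("Signed contract attached", "Please file.")]
    ranked = sorted(inbox, key=lambda m: record_likelihood(*m), reverse=True)

Even this toy version hints at why results vary so much: predictable, well-described email plays to the model's strengths, while everything else falls through the cracks.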

 

The future of electronic recordkeeping

When I ponder the future of recordkeeping, I could readily try to predict plenty of technology trends – that’s always fun. I could predict that we’ll find ourselves pretty much entirely in the cloud someday. I will certainly predict that the ECM vendors will catch up with the SharePoint RM add-in vendors and deliver RBR. But these predictions are not going to help us. Besides, that’s not what I’m seeing in my crystal ball. All I see in the future is the one thing that obsesses me these days. That one thing? Education.

That’s what we need more of. Lots of it. All the great technology in the world won’t help us if we don’t know how to utilize it to meet our goals. We have plenty of technology, and that technology has now advanced to the point where we can largely automate the recordkeeping processes. In the past, the technology has been holding us back. Not anymore. Now, we’re holding ourselves back. We need to better equip RIM professionals with the knowledge and skills to understand ECM to the level where we can influence how ECM is configured for deployment. We need to understand this modern EDRMS software better, to learn how to automate using RBR. And we need to better understand EDRMS project management methods and techniques, such as defining key performance measures to ensure the health of the projects.

Not too long ago I suggested to a large, global records management firm that EDRMS projects were not very successful, and that a massive, large-scale education program was needed to turn the tide. The reaction? “Our customers are doing just fine – you don’t know what you’re talking about”.

We now have the technology to get the job done. Think of a modern passenger airplane sitting on the tarmac. It’s nothing but a useless, expensive piece of wonderful technology if we don’t have a pilot with the training and skills to fly it. In the EDRMS business today, there are far too many planes parked on the tarmac. So what’s holding us back?

 

Works Cited

1. As of the date of this paper, RDIMS has since become known as GCDOCS.

From Chaos to Order – A Case Study in Restructuring A Shared Drive

 

By Anne Rathbone, CRM and Kris Boutilier

 

Synopsis

In 2011, the Sunshine Coast Regional District (SCRD), in British Columbia, embarked on a project to restructure the shared drive used by all its employees. The new shared drive went live for staff in March, 2012 and by December of the same year, all records were migrated to the new drive. This case study features the SCRD’s management of electronic documents, how the new drive was restructured and the lessons learned from the project.

 

Introduction

Incorporated in 1967, the SCRD is one of the smaller regional districts in British Columbia, encompassing almost 3800 square kilometres of land and providing services to more than 28,000 residents. There are about 250 employees spread over 14 sites; nearly 200 use computers in some fashion and therefore generate electronic documents.

Regional districts are similar to counties in other parts of Canada, providing municipal-style services to unincorporated areas as well as providing region-wide services to member municipalities. Local governments, including regional districts, are often segregated along business lines. The SCRD has 98 diverse and entirely distinct business units including:

  • Accounting, payroll, financial and investment services;
  • Legislative services and bylaw compliance;
  • Solid waste management;
  • Water supply and distribution;
  • Parks planning and management;
  • Four distinct fire departments;
  • Five separate recreation facilities; and
  • Transit

Each of the departments has very specific needs with regard to managing records, and often these needs are driven by external legislation. As an example, our fire departments not only respond to fire and accident calls; they also need to manage their radio licenses and leases for radio towers, and they have acquired their own records management system (Fire RMS), software specifically for managing their emergency records that is maintained and operated by an external service provider. In addition to the four fire departments directly operated by the SCRD, there are also two independent fire departments – Sechelt and Pender Harbour. Although all six departments operate independently, the SCRD operates a service for all of them to provide 9-1-1 dispatch and to maintain the dispatch radio network, which is handled through third-party contractors.

However, as we are legally one single organization, the information must be treated consistently with respect to handling, classification and retention, regardless of how or where it is created.

At the time the SCRD embarked on the shared drive restructure project, there was no system in place to manage electronic records. The details of the state of the SCRD’s electronic records will be provided later in this article.

The classification schedule from the Local Government Management Association (LGMA) of British Columbia’s Records Management Manual was being used for hard copy records, which are maintained in a central file room and an in-house inactive records centre. The structure of the schedule is such that it reinforces the concept that the SCRD is ultimately one corporation, even though it usually thinks in a compartmentalized manner. As an example, under the large function “Engineering and Public Works”, there is a primary 5280 Environmental Management and secondaries related to air quality, chemicals, noise control, and so forth. Retention is applied at the secondary level. See Figure 1.
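For readers unfamiliar with block-numeric schedules, the structure described above can be sketched as a simple nested mapping in Python. The 5280 primary comes from the text; the secondary codes and retention periods shown here are illustrative only:

    # Sketch of a block-numeric schedule: function -> primary -> secondaries.
    schedule = {
        "Engineering and Public Works": {                # function
            "5280": {                                     # primary
                "name": "Environmental Management",
                "secondaries": {                          # illustrative codes/retentions
                    "5280-01": {"name": "Air quality",   "retention_years": 7},
                    "5280-02": {"name": "Chemicals",     "retention_years": 10},
                    "5280-03": {"name": "Noise control", "retention_years": 5},
                },
            },
        },
    }

    # Retention is resolved at the secondary level, as the LGMA schedule prescribes:
    primary = schedule["Engineering and Public Works"]["5280"]
    years = primary["secondaries"]["5280-01"]["retention_years"]  # -> 7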

History of SCRD’s Shared Drive

Before the SCRD started the shared drive restructure project, finding a document was like finding your way through a maze. The original shared drive, called “H:” was established in 1994 to take advantage of a new Novell Netware file server. There were about 35 computers all at one site and all running DOS. Creating the shared drive was intended to eliminate the “floppy disk shuffle”.

The H: drive structure was modeled around the approximately 22 major departments that existed at that time. The contents of the departments’ folders were intended to be transitory – conceptually, draft documents were prepared within a department’s private folder and the final edition eventually moved to a shared access folder. Shared folders were organized by year and originating department, much like how the SCRD organized its correspondence files at that time. Any sensitive files, such as Human Resources, remained on floppy disk and were stored under lock and key.

The hard copy was considered the ‘R’ecord and the electronic version was simply for retrieval convenience and to allow reusing well-formatted documents such as agendas and minutes (this occurred long before Microsoft Word Document Templates existed).

Over time, the SCRD added dBase databases and other structured data systems such as Accpac. Those systems also stored their data on the file server where it was reliably backed up and protected against corruption.

There was a file name limitation of eight characters, with a three-character extension. DOS did not require the extension to reflect the application used to create the file; therefore, when working with WordPerfect documents, the extension was often used to store the initials of the document creator.

The H: drive grew and evolved resulting in:

  1. 'Walled gardens', where departments could and did do anything they wanted; and
  2. Folder structures that 'evolved', theoretically to match each department's needs.

As technology changed, and with the elimination of the restrictions in file name length, users found they could create subfolders and include information in the subfolder title that more correctly belonged in the file name itself, such as creating “Year” folders instead of putting the year directly in the file name.

In 1998, the SCRD experienced a sudden and dramatic corporate restructure, and the H: drive no longer substantially followed the SCRD's organizational structure. With this change, and with ongoing growth in the size of the organization, there was an increasing need for collaboration across departments. To handle this need, a set of "teams" folders was created, with explicit permissions granted to specific users collaborating on select projects. The H: drive and the SCRD's organizational structure continued to diverge as new services and departments were added without modifying the H: drive.

Most importantly, permissions were modeled around the “named user”. Complications arose when a user left the organization or when collaboration on a project transferred to another employee.

By 2012, the H: drive contained over 465,000 files in over 40,000 folders. Due to the volume of files:

  • Duplication was rampant – some staff copied a document to where they thought they would be sure to find it again; and
  • Disappearances were frequent – other staff would find a document and then move it to where they thought it belonged, without advising the author of the move.

Since no rules had been developed to explain how records were to be saved, there was simply no way to train new staff on how the shared drive worked, what was to be saved there, or how it was to be saved.

Because of the walled gardens and lack of naming conventions, it was extremely difficult to find an electronic version of a document when the corresponding hard copy was destroyed. This made it impossible to apply the SCRD’s retention schedule to electronic documents.

As a result, the SCRD faced numerous issues:

  • A lack of confidence that the searches for electronic documents were sufficiently complete for either Freedom of Information requests or litigation and claims;
  • Concern there may be violations of the Freedom of Information and Protection of Privacy Act of BC (FOIPP) – some personal information collected in the course of business was stored in open folders;
  • Information leakage – some personnel documents and in-camera minutes were stored in open folders, permitting access beyond the required staff;
  • Zombie documents proliferated – if documents otherwise eligible for deletion were not saved in the appropriate folder they could linger forever, shambling back out into the light at exactly the wrong time;
  • Lost files, including the SCRD’s bylaws;
  • The volume of password-protected documents grew, and these documents were effectively invisible to searches;
  • Collaboration between departments was difficult – resulting in documents being emailed between staff, which were then named and saved in different folders, amplifying the duplication problem; and
  • Proliferation of subfolders with a user’s name (for example, Jane’s Stuff).

Due to the lack of controls on subfolder creation, the H: drive had evolved unbelievably deep folder structures. SCRD staff lost files because full path names – the folder path plus the file name – grew so long they exceeded the operating system's limits.
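A quick audit for this particular failure mode is straightforward; here is a minimal sketch in Python, assuming the classic 260-character Windows path limit:

    # Find files whose full path meets or exceeds the classic Windows limit.
    import os

    MAX_PATH = 260  # includes drive letter, folders, and file name

    def find_long_paths(root):
        """Yield full paths under `root` that meet or exceed the limit."""
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                if len(full) >= MAX_PATH:
                    yield full

    # Example: for p in find_long_paths("H:\\"): print(len(p), p)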

All minutes and agendas were stored in a central folder, and correspondence was organized by year and month, then by originating department. So, staff had to remember the month they wrote a letter in order to find it again – and by the time the restructure project was started, there were almost 230 months from which to choose.

Figure 2 is a circular treemap, a graphical model that attempts to summarize the overall structure of a shared drive. The central black dot is the top level of the drive; moving outwards, each ring represents a level of folders, so folders-within-folders quickly build out as 'spikes'. The overall size of a spike indicates the depth of the subfolder tree needed to reach that folder. Figure 2 captures all 40,405 nested folders in the H: drive, where every branch of folders has been growing independently of all the others. Particularly evident is the lack of symmetry, including seemingly random excursions as deep as 14 folders. Every asymmetric aspect is an exception and has some unique reason for existing that must be contemplated when navigating – yet that reason is likely known only to the person who created the subfolders. It is also worth noting that records could be spread across any level of folders, anywhere, not just in the final folder of a particular path.

 

 

Figure 3 extracts the 8500 folders in the Infrastructure Department walled garden (the portion of the overall graphic outlined by the red wedge). The tiny black line on both scales of the treemap identifies an example folder at the end of the deep spike: \Infrastructure\Sustainability\1425-20 – Public Education Program\Admin\Email Saved\Recycling Feedback\General\For Curbside Pick Up\Roberts Creek

 

  

Budget Impacts

In 2003, consultants were hired to evaluate the SCRD’s requirements for an electronic document and records management system (EDRMS), which led to the preparation of a draft Request for Proposals. The consultants also included design recommendations for the new administration office building that was under construction. While the design recommendations were incorporated, the EDRMS funding was frozen by senior management as it was felt the project represented too much change for staff when combined with the move to the new building.

In 2007, a budget was put in place for a full-time Records Management Technician, with the goal of achieving two major projects: injecting the LGMA classification schedule into the H: drive and implementing an EDRMS. In 2008, a budget request for an EDRMS was again submitted, but was not approved. Then, in 2011, the SCRD Board of Directors approved $10,000 to improve the structure of the H: drive.

 

Project Structure

With the SCRD Board of Directors’ approval of funding, the restructure project was initiated. Two committees were struck: the steering committee and the working committee.

The steering committee consisted of the Corporate Officer and her Deputy, the Records Management Technician, the Manager of Information Technology (IT) Services and two of her staff. The Corporate Officer and IT Manager were tasked with communicating to SCRD management critical aspects of the project, such as:

  • Why the project was needed;
  • Timelines;
  • Work each department would need to accomplish;
  • Budget and staff resources required.

The steering committee quickly determined that the H: drive could not be fixed and that a new shared drive was required – the N: drive.

It was critical that the steering committee members understood each other's perspectives in order to accomplish the goals of the restructure project. From an IT perspective, with storage becoming ever cheaper, records can be kept forever: capacity is not an issue, search technology is improving very quickly, and there are innovations in computational knowledge extraction.

From a records and information perspective, however, best practice requires that records be managed from creation to final disposition – just because we can keep everything forever does not mean we should.

The working committee consisted of at least one representative from each department – the representative would be that department’s advocate during the project. In addition to bringing their department’s needs and wants for the shared drive forward to the committee, the representatives communicated to their respective departments the reasons why the project was needed and how the SCRD would accomplish the change. As well, the representatives were to communicate specific processes each department was required to accomplish such as:

  • Reviewing files in their H: drive folder and deleting what they could; and
  • Determining the subfolders the department would need in the new shared drive so the drive could be seeded before cut-over.

A price request, project outline and desired outcomes were sent to several consultants. A request for proposals was not needed as the project budget of $10,000 was below the threshold set by the SCRD’s Purchasing Policy.

A consultant was hired in September, 2011. By November, 2011, on-site work had begun.

The on-site work included folder structure revision and enhancement, as well as interviews with staff to solicit specific knowledge of each department’s requirements. The original timeline was to implement the new shared drive structure on January 1, 2012; due to staff and consultant workloads, cut-over was achieved March 1, 2012.

The steering committee decided it was important to set a target date when the H: drive would disappear and felt six months would be sufficient – August 31, 2012. The H: drive would then be completely deleted by December 31, 2012.

The desired outcomes for the project were:

  • Resolve internal collaboration issues;
  • Implicitly purge electronic files that were beyond retention;
  • Reduce information duplication;
  • Improve the quality of search results;
  • Provide certainty for Freedom of Information and litigation searches;
  • Prepare for the eventual import of electronic files into a formal EDRMS; and
  • Capture rudimentary knowledge from senior employees nearing retirement.

The project outline seemed simple:

  • Create a new folder framework;
  • Determine lower level folder structures;
  • Create a draft design of the folder structure;
  • Determine and document permissions;
  • Set up the new folders and infrastructure;
  • Train staff;
  • Migrate to the new folder structure; and
  • Delete the old drive.

However, the scope of the project was quite large as it included all electronic records and email, as well as providing the ability to restrict access to confidential records.

In creating the folder framework, the SCRD required the consultant to design a high-level folder structure that would be universal across all departments. The consultant met with the steering committee to gather the requirements for the folder structure.

Part of the project outline was the requirement that the consultant help the steering committee adapt the LGMA schedule for use with the SCRD’s electronic documents. Using the same schedule for both paper and electronic records would reinforce for users the concept that the information was the same; only the medium was different.

 

Folder Structure

The initial high-level folder structure used the 16 functional folders from the LGMA schedule:

  1. Administration
  2. Assets and Procurement
  3. Buildings, Facilities and Properties
  4. Community Services
  5. Finance
  6. Human Resources
  7. Information Systems and Service
  8. Infrastructure
  9. Land Administration
  10. Legal Matters
  11. Legislative and Regulatory Affairs
  12. Parks Administration
  13. Planning and Development
  14. Protective Services
  15. Recreation and Culture
  16. Transportation and Transit Service

The lower level folder structure would be specific to each department. To accomplish this, the consultant conducted a file and document analysis and interviewed staff in each department.

In the LGMA classification schedule, each primary has a “general” secondary. Wherever possible, this was eliminated to encourage staff to file in an appropriate secondary without a default option.

An absolute limit on tree depth was imposed – four levels: function, primary, secondary and an optional folder. Figure 4 shows the folder structure for grants received from organizations.
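
To illustrate how mechanical such a rule can be, the sketch below checks a proposed path against the four-level limit. It is a minimal sketch intended to run on Windows; the folder names are hypothetical examples.

    import os

    MAX_LEVELS = 4  # function, primary, secondary, optional folder

    def within_depth_limit(path, root="N:\\"):
        """Return True if the path stays within the four-level folder limit."""
        rel = os.path.relpath(path, root)
        return len(rel.split(os.sep)) <= MAX_LEVELS

    print(within_depth_limit(r"N:\Finance\Grants\Received\2011"))         # True
    print(within_depth_limit(r"N:\Finance\Grants\Received\2011\Drafts"))  # False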

[Figure 4: folder structure for grants received from organizations]

While staff had access to, and could create files within, any pre-existing secondary or optional folders they could see, any modification, addition or deletion to the folder structure would be made by the Records Management Team. The team consisted of the Records Management Technician, the Corporate Officer and the Deputy Corporate Officer. The ability to create subfolders was removed to ensure that the N: drive would not have the proliferation of subfolders that the H: drive had. There were some exceptions where the volume of subfolders required daily was high – such as building, zoning and development permits. Subfolder creation requests were sent to the Records Management Helpdesk and the final decision rested with the Records Management Team.

As previously mentioned, some modifications of lower level folders were required to accommodate departmental needs. Figure 5 shows the modifications necessary for zoning applications – very complex items that require several subfolders. Under the traditional LGMA structure, these subfolders would not be possible, as they would violate the “four levels of folders” rule.

Figure 6 shows the modifications required to accommodate the segregation of operations, which allowed subfolders for each waste water treatment plant. In this case, while each waste water treatment plant operates independently, each operation is essentially the same. Therefore, each secondary has the same subfolders.

 

Project Issues

Email

A vast amount of business knowledge is contained in email, and users tend to hoard messages, in keeping with the adage, “If in doubt, keep it.” This creates problems when there is a conflict between retention policies and corporate needs. There is also the perception that emails are ‘different’ from other electronic documents. And classifying email is hard – there can be a rich variety of content in a chain of messages, making it difficult to identify a principal classification.

The SCRD has regularly conducted training for users on specific rules for who should save emails and how:

  • The originator of an internal email would save it in the N: drive. Recipients would delete the email;
  • Recipients of external email would save the incoming email in the N: drive. If there was more than one recipient within the SCRD, the first person named would save it;
  • All emails would be saved in Outlook’s .msg format to preserve headers and metadata;
  • Inboxes had a 500 MB limit applied to encourage saving in the N: drive. If the limit was exceeded, emails could be received but not sent.

Unfortunately, this training and the specific rules have produced low-quality results when there are large numbers of emails to move into the shared drive. Without automation, it will be difficult to improve.
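
One plausible first step toward that automation is deriving a consistent filename from a message’s own metadata. The minimal sketch below uses Python’s standard email module; the SCRD saved mail as Outlook .msg files, but this illustration parses an RFC 822 (.eml) file so it stays self-contained, and the naming pattern is a hypothetical convention rather than the SCRD’s.

    import email
    import email.utils
    import re

    def suggest_filename(eml_path):
        """Propose a 'date - sender - subject' filename for a saved message."""
        with open(eml_path, "rb") as f:
            msg = email.message_from_binary_file(f)
        sent = email.utils.parsedate_to_datetime(msg["Date"])
        sender = email.utils.parseaddr(msg.get("From", ""))[1] or "unknown"
        # Strip characters that are not legal in Windows filenames.
        subject = re.sub(r'[\\/:*?"<>|]', "", msg.get("Subject", "no subject"))
        return f"{sent:%Y-%m-%d} - {sender} - {subject.strip()}.eml"

    # e.g. suggest_filename("feedback.eml") might return
    # "2012-03-01 - resident@example.com - Recycling Feedback.eml"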

Originally, the steering committee decided that when sending emails internally, hyperlinks were to be used and any attachments would be stripped. This was to decrease the amount of file duplication, as an attachment lingered in the sender’s “Sent Items” as well as the recipient’s “Inbox” and possibly already existed in the N: drive. On the surface, a Microsoft Exchange transport rule can easily be built to block attachments. However, it was determined that many system functions operate via attachments, such as Calendar appointments and vCards (virtual business cards in Outlook), so the transport rule was built with a size threshold. The result: the Exchange transport rule was considerably more complex than originally anticipated.

Discussion with the working committee determined that attachments were necessary for consistency if an email was sent both internally and externally. This often occurs when distributing agenda packages to SCRD committees, as well as when sending certain other types of information. Therefore, the rule to strip attachments from internal emails was relaxed: staff were trained to include a hyperlink, but if there was an attachment it was allowed to go through. The sender received an automated warning, and a copy of the email was placed in an inspection folder for later review to measure compliance.
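
The decision logic that finally emerged is easier to see in code than in Exchange’s rule builder. The sketch below only illustrates that logic – it is not Exchange configuration – and the size threshold is a hypothetical value chosen so that small system attachments such as appointments and vCards pass.

    SIZE_THRESHOLD = 50 * 1024  # hypothetical: small system attachments pass

    def flag_for_inspection(sender_is_internal, all_recipients_internal,
                            attachment_bytes):
        """Apply the relaxed rule: warn on large attachments in internal-only mail."""
        if not (sender_is_internal and all_recipients_internal):
            # Mail involving any external party keeps its attachments.
            return False
        # Internal-only mail: a large attachment is allowed through, but the
        # sender is warned and a copy lands in the inspection folder.
        return attachment_bytes > SIZE_THRESHOLD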

NTFS Permissions

Microsoft’s New Technology File System (NTFS) was introduced in 1993. It is the suite of mechanisms by which Microsoft ‘Server’ operating systems, and now all versions of Windows, store and retrieve persistent data, such as files on disk. NTFS includes an extremely flexible permissions or ‘access control list’ framework that allows for almost any conceivable arrangement of access and change control.

However, the complexity of these permissions massively increases the risk of unintended consequences, especially when they are maintained manually. Mistakes in permissions create holes; users will discover them and probably will not mention what they have discovered. Essentially, the user may believe, “The system didn’t prevent me, therefore it must be permitted.” It is necessary to ensure permissions are structured so they can be applied consistently:

  1. Do not allow ad-hoc exceptions, if at all possible.
  2. Make the permissions predictable and pattern-based.
  3. Use tools such as Somarsoft DumpACL, Hyena, or PowerShell to audit the file system permissions after they have been established, particularly if building the permissions manually (a minimal audit sketch follows this list).
  4. Leverage inheritance (see Figure 7).
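
The sketch referenced in step 3 is below: it shells out to the built-in Windows icacls utility, which marks inherited access control entries with “(I)”; any entry without that flag was set explicitly and is worth reviewing. The parsing is deliberately simplified, and the folder in the usage example is hypothetical.

    import subprocess

    def explicit_aces(folder):
        """List ACL entries on a folder that were set directly, not inherited."""
        result = subprocess.run(["icacls", folder],
                                capture_output=True, text=True, check=True)
        suspects = []
        for line in result.stdout.splitlines():
            entry = line.strip()
            # ACE lines look like 'DOMAIN\group:(I)(OI)(CI)(F)'; skip the
            # summary line and anything flagged as inherited.
            if "(" in entry and "(I)" not in entry:
                suspects.append(entry)
        return suspects

    # e.g. explicit_aces(r"N:\Finance") returns entries someone set by hand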

Restricted Folders and Files

Determining how to handle the SCRD’s confidential documents required long discussions. Human Resources’ internal records had to be restricted to Human Resources, but how could access be restricted to personnel documents that managers create? Should the manager of one department be able to see the personnel documents relating to another department? What about supervisors? How could litigation, claim and accident files be efficiently managed when access would depend on who was involved?

Ultimately “restricted” functions, which paralleled the unrestricted functions, were created. Following permission management best practices, access to those “restricted” folders was explicitly granted to job roles, rather than named users, to reduce ongoing maintenance risks.

It was made clear to users that not wanting anyone to “see” a document did not, by itself, make the document confidential. FOIPP (British Columbia’s Freedom of Information and Protection of Privacy Act) was used as the guide to identify what was confidential.

This structure had the added advantage that the Microsoft Search indexer could reliably exclude all the restricted files, and a second, separate index could be created that included only them.

Figure 8 shows the structure of the Legal Matters (Restricted) function; only the Chief Administrative Officer (CAO) and the Corporate Officer have top-level permissions. Access to the subfolders would depend on who is involved, and permissions would be set on a case-by-case basis. For example, the Transit Supervisor has access to the Transit accidents but not the Fleet accidents.

Understanding and leveraging NTFS ‘permission inheritance’ is crucial – defining permissions at the right point in the folder tree minimizes maintenance effort and the risk of errors.

 

 

Final Top-Level Folders

With the restricted folders, the final result was 24 top-level folders:

  1. Administration
  2. Administration (Restricted)
  3. Assets and Procurement
  4. Assets and Procurement (Restricted)
  5. Buildings, Facilities and Properties
  6. Community Services
  7. Finance
  8. Finance (Restricted)
  9. Human Resources
  10. Human Resources (Restricted)
  11. Information Systems and Service
  12. Infrastructure
  13. Land Administration
  14. Legal Matters
  15. Legislative and Regulatory Affairs
  16. Legislative and Regulatory Affairs (Restricted)
  17. Parks Administration
  18. Planning and Development
  19. Protective Services
  20. Protective Services (Restricted)
  21. Recreation and Culture
  22. Recreation and Culture (Restricted)
  23. Transportation and Transit Service
  24. Transportation and Transit Service (Restricted)

 

Training

Prior to the cut-over to the N: drive, all staff were required to attend Managing Information in N: Drive Training (MINT) – and were appropriately rewarded with mints, chocolate mint cookies and mint tea! These rewards were enthusiastically received by staff and similar marketing tools have been used effectively for other RIM training.

Support from senior management for this training was crucial; without attending MINT, staff would be lost once the N: drive went live. Once staff attended the training, IT provided them with read-only access to the N: drive so they could explore and become familiar with it.

There was some push-back from a few staff members and certain managers. Some of the concerns were legitimate, such as lost productivity or the lack of budget to bring in relief staff. However, some staff simply refused to do the training because they did not see any value in the project. To ensure the success of the project, it was decided that training was mandatory and that those who continued to refuse it would not be using a computer.

Training was done by department, which allowed the training to focus on the department’s specific questions.

To accommodate all staff:

  • Training was scheduled over a six week period and training space was dedicated to the project;
  • Presentations and discussions were used, as learning the concepts was the focus, not how to use Windows;
  • Staff from remote sites came in to the main office; and
  • Staff working outside the main office hours were specially scheduled to be able to attend.

The training included:

  • A basic overview of records management and legislation requiring the management of SCRD records;
  • Why the shared drive needed to be cleaned up;
  • New rules for the new drive;
  • Requirement to use hyperlinks instead of attachments when sending internal emails;
  • Naming conventions;
  • An overview of the classification schedule;
  • How to classify their documents; and
  • Instruction on basic Windows concepts (hyperlinks, shortcuts and searching).

All attendees received a cheat sheet to help them find where to save files and a list of acronyms and abbreviations commonly used by the SCRD. The training, cheat sheet and acronyms were also available on our intranet.

 

Putting it Into Practice

Pre-cutover Activities

Because the H: drive had never been purged, departments were encouraged to review existing files and clear out “the garbage”. Several departments were enthusiastic about the change and some even rearranged their folders on the H: drive to reflect the new structure. Some departments dedicated specific staff to purging and preparing to move files, some said everyone would be responsible for their own files, and some hoped the whole thing would just go away!

After training, any time an attachment was emailed internally, the sender received an automated warning message.

Prior to this project, IT had been using conventional ‘roaming profiles’, whereby a user’s personal files (My Documents, etc.) were copied down from a central server to the desktop they were logging on to, and copied back at the end of the session. This was upgraded to ‘folder redirection’, whereby users’ files were always saved directly to a central server. This allowed file system quotas to be implemented on personal folders, as a pre-emptive strike on hoarding.

 

Point of No Return and Go-live Pain

At 11:50 pm, February 29, 2012, the H: drive permissions for all users were reset to Read and Delete. At 12:01 am, March 1, 2012, after a very large deep breath, the Read-only flag was removed from the N: drive.

All users were now able to create/read/write files on the N: drive. All users were also able to read/delete anything on the H: drive, with the exception of Human Resources, certain databases, and the “Teams” folders.

On the first day after cut-over, the Records Management Helpdesk had over 250 requests, not all with a professional tone. As the SCRD only has 250 employees, the number of requests on the first day was, essentially, a one-to-one ratio to the number of staff. Staff who had not yet taken the training couldn’t understand why they could no longer create subfolders. People had trouble finding files and several demanded that the H: drive be re-instated. The Corporate Officer, in response to the first such demand, suggested that, as it had only been 45 minutes, they might want to wait a little longer. In addition to staff’s frustrations, we found there were some permission errors.

During the first week, special one-on-one training was done with staff who had not attended MINT. As they were already frustrated, they were not very receptive to the changes. However, this special training did contribute to the decline in the number of helpdesk requests. By the end of the first week, the number had dropped to about 75 per day.

Many users had problems conceptualizing that they did not need subfolders. The steering committee felt that if files were named correctly, staff could use Windows Explorer to find what was needed.

Many departments had not provided a list of required subfolders, such as the vendor names needed in several secondaries. Had the list been provided prior to cut-over, the subfolders could have been seeded. After cut-over, the Records Management Technician had to create the folders individually and then set permissions on each folder. In restricted folders, setting permissions could take hours or days.

 

Resistance – Futile but Frustrating

There were some staff who continued to resist the implementation, believing that their position or job duties allowed them special privileges – such as being able to have more than four levels in a folder tree or being able to save to their C: drive. In these instances, the parties and their managers were reminded of the new policies and procedures.

Other staff began using My Documents for storing their files. To discourage this practice, a 500 MB soft limit was implemented, and IT ran regular reports to find the offenders (a minimal sketch of such a report follows this list):

  • Over 400 MB – the offender was given a warning email, cc to RIM;
  • Over 490 MB – another warning email was sent, cc to manager and RIM;
  • Over 500 MB – RIM discussed the problem with the offender.
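
Below is that sketch, assuming the escalation tiers just listed; the warning steps are printed rather than emailed, and the path to the redirected personal folders is a placeholder.

    import os

    MB = 1024 * 1024

    def folder_size(path):
        """Total bytes under a folder; files that vanish mid-scan are skipped."""
        total = 0
        for dirpath, _dirs, files in os.walk(path):
            for name in files:
                try:
                    total += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass
        return total

    def escalation(size_bytes):
        if size_bytes > 500 * MB:
            return "RIM discusses the problem with the offender"
        if size_bytes > 490 * MB:
            return "warning email, cc to manager and RIM"
        if size_bytes > 400 * MB:
            return "warning email, cc to RIM"
        return None

    ROOT = r"\\server\redirected$"  # placeholder for redirected My Documents
    for user in os.listdir(ROOT):
        size = folder_size(os.path.join(ROOT, user))
        action = escalation(size)
        if action:
            print(f"{user}: {size // MB} MB -> {action}")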

There were some staff who consistently exceeded the 500 MB limit, despite having been spoken to by their managers. In those cases, the violator’s account was disabled until the user met with RIM and IT to: 1) determine the ongoing problem, and 2) receive extra training.

 

Demise of H: Drive

As the migration of files to the N: drive and the purge of the H: drive progressed, staff were kept informed of the progress via the intranet. This was to try to help them see the “big picture” as time passed. Figure 9 is an example of the update provided to staff.

No statistics were kept as to the number of records deleted from H: drive vs. those moved into N: drive. Approximately 98% of the folder growth in N: drive was due to the migration from H: drive and only 2% could be attributed to new folder creation.

As the date for the demise of the H: drive approached, staff tried to circumvent the system, and they were pretty creative about it. As the Corporate Officer stated, “Never underestimate the creativity of those who wish to circumvent the system.” Instead of moving required documents into the N: drive, some uploaded them to an FTP site or moved them to their C: drive; some even moved them to the staff photos folder on the intranet. Unfortunately, sometimes documents were lost before IT or RIM became aware of the attempt at circumvention.

Some staff also started saving documents to USB keys, CDs and removable hard drives. It was made clear that this was unacceptable. Senior managers were very supportive in making their staff move the contents into the N: drive. However, if they did not know it was happening, it was difficult to stop.

Staffing became an issue as well. Some managers advised that their staff were too busy to deal with the H: drive and there was no budget to bring in extra staff to assist with moving the files.

These were valid issues, but they were the same ones faced by all departments. Proactive departments assigned staff to work on the move over a period of time.

As the deadline approached, managers were given some options to assist with the staffing issue:

  • Assign the task of moving the documents to a staff member and arrange one on one attention with the Records Management Technician; or
  • Ask the Records Management Technician to do the work – but it would be outside her normal hours and the department would be responsible for paying the overtime.

No manager chose either of these options.

There were people who simply refused to recognize there was an issue. In one department, the manager advised that all the necessary documents had been moved and what remained could be deleted. When the steering committee checked with department staff, they were advised there was a huge volume of documents that still needed to be reviewed.

Originally, the timeline to unplug the H: drive was six months after implementation of the new shared drive. After two extensions, November 30, 2012 was set as the date the H: drive would be disabled. A week prior to the deadline, a further extension was requested. As the options for help with moving files had not been used, the CAO decided the deadline would stand.

It was assumed that by the November 30th date the H: drive would be empty – all necessary files would have been migrated to the N: drive and any ROT (redundant, obsolete or transitory) would have been deleted. This was a completely erroneous assumption. On December 1, 2012, a review of the H: drive showed some important records remaining, including some critical to ongoing litigation. The project team took on the responsibility of moving those documents to the N: drive.

 

Surprises After H: Drive Disappeared

There were some unpleasant surprises after the H: drive disappeared:

  • While the volume of requests to the Records Management Helpdesk diminished dramatically, it did not dissipate. Subfolders still needed to be created on a regular basis, especially new case files. That demand continued to put pressure on RIM as it was added to the already long list of regular duties;
  • Because some staff were still not well versed in Windows and some staff still thought they were “special”, the time invested with those individuals remained high;
  • Requests continued to be made to search the H: drive for specific files. Unless it was related to litigation or some other pressing issue, those requests were denied;
  • Some staff continued to resist the N: drive, preferring to use removable mass storage. When discovered, sanctions were taken to rectify the situation. However, short of disabling everyone’s ability to use removable devices, no solution was found to eliminate this practice.

 

Lessons Learned

This was the first corporate-wide RIM and IT project. As such, there were many lessons learned.

Marketing/Change Management

There needed to be more comprehensive project marketing to senior management, managers and staff. All three levels needed to understand the reasons behind the project, the benefits to them, what the project’s goals were and how those goals would be achieved.

In addition, for most people, a uniform file and classification system is an abstract concept. More education on why the structure was being imposed as well as more of a focus on how the structure worked would have eased much of the users’ frustrations.

 

Senior Management Buy-In

The degree of buy-in at the senior management level varied considerably based on the individual’s past experience. If they did not see the value, there was no reason to motivate their staff to do the work. In addition, the makeup of the senior management team changed during the project.

Those departments that had functional and well protected “walled gardens” did not see the need for change. However, for those who were on the outside of those “walled gardens”, the need for change was self-evident.

Withdrawal of support part way through would have sunk the project. Being more aware of those individuals who were not supportive of the project would have helped. The CAO’s support in backing the project when complaints were received about the extra work being required was an absolute necessity.

 

Time and Budget Allocation

Time allocations needed to be a priority – up-front and visible dedication, not “borrowed” from other existing allocations. Staff with domain expertise needed to be assigned to work on the project and their day jobs transferred elsewhere. Unless staff members supported the project, or were dedicated to it, working on it off the side of their desks automatically made it a low-priority task.

Staff costs and downtime should have been budgeted. Managers were expected to draw on funding that had already been defended to the SCRD’s Board of Directors on the basis of other work commitments, and the project was outside the scope of each department’s Human Resources plan. A few managers had the misconception that, in essence, the project ‘stole’ money from their budgets without their active participation, which led to frustration and resentment.

 

Departmental Representatives

The representatives on the working committee needed to have been chosen more carefully. Most managers appointed one of their clerical staff. In some cases that was the best decision, but in other cases the clerical staff: a) were not sufficiently computer literate; b) did not fully understand the implications for their department; or c) were not aware of all the information their department needed or produced. This led to the creation of folder structures and classifications that did not meet departments’ needs and caused frustration for their users; the steering committee should have sought approval from each manager prior to finalizing the folder structure.

 

Lack of Familiarity with Windows

Most staff used Windows tools on a daily basis but were unfamiliar with much of its functionality. Concepts such as shortcuts, hyperlinks, searching and ‘scopes’ were unfamiliar to a large percentage of staff, which was surprising to the steering committee. Therefore, the N: drive training was expanded to include some of these concepts and staff were encouraged to take the Windows course offered through the SCRD’s corporate training program.

 

Consistency with Similar Records

Different departments managed the same records differently:

  • Human Resources treated criminal record checks as part of an employee file, with FOIPP obligations satisfied;
  • Checks on volunteers were managed by individual departments – some recorded the check as being satisfactory, and then destroyed the document. Others retained the original document;
  • Some volunteers were also employees – the same document would exist twice, and would be managed inconsistently based on the ‘hat’ the individual was wearing at the time the check was performed.

These differences still need to be explored and processes put in place so the same type of document is treated consistently throughout the organization.

 

Restricted Folders – Fine Grained Access and Folder Traversal

FOIPP requires that confidential information protected by the act must not be disclosed. Every claim, accident and litigation file potentially involves a unique set of staff, and the resulting confidential information must be protected in order to comply with FOIPP. Therefore, permissions on restricted folders needed to be assigned on a case-by-case basis. Certain permissions were implicit – for example, the Purchasing Officer would have access to bids in progress. However, the vast majority of restricted folders have unique and distinct permissions. The N: drive model required an inordinate volume of folders to implement the appropriate segregation of permissions.

Folder traversal – clicking down through subfolders – was disabled in restricted folders to prevent information leakage via folder names. This also prevented ‘browsing’ into folders a user was otherwise permitted to see.

Therefore, unless a user had their permissions set at the topmost branch of a folder tree, they were unable to navigate to the restricted folder they needed. The user had to be provided with a shortcut that allowed them to ‘jump’ to the restricted folder. Creating the shortcuts rested with the Records Management Technician and was time-consuming.
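
Creating those ‘jump’ shortcuts can at least be scripted. The sketch below assumes the pywin32 package on Windows and builds a .lnk file; both paths are hypothetical examples, and this is an illustration rather than the SCRD’s actual procedure.

    import win32com.client  # pywin32, assumed installed on Windows

    def make_jump_shortcut(link_path, target_folder):
        """Create a .lnk shortcut that jumps straight into a restricted folder."""
        shell = win32com.client.Dispatch("WScript.Shell")
        shortcut = shell.CreateShortCut(link_path)  # path must end in .lnk
        shortcut.TargetPath = target_folder
        shortcut.Save()

    make_jump_shortcut(
        r"C:\Users\tdispatch\Desktop\Transit Accidents.lnk",   # hypothetical
        r"N:\Legal Matters (Restricted)\Accidents\Transit",    # hypothetical
    )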

 

Ongoing Training

Although it was made very clear in the training that ‘Records Belong to the Corporation’, and this position continued to be reiterated with staff, many still struggled with the concept. Some staff didn’t feel the need to follow SCRD naming conventions and some continued to think there was no problem with copying and/or emailing documents for disclosure outside the organization. There was still some confusion as to what constituted a record vs. a transitory document.

Education and training were critical. Every new staff member who would be accessing a computer attended N: drive training (MINT), usually in the first week of their employment. They had read-only access until the training was complete. All managers and supervisors accepted this as part of the on-boarding process. However, there was a need to work with Human Resources to coordinate new-hire start dates so training sessions consisted of more than one person at a time.

The MINT sessions for new staff were expanded to include training on protecting privacy, phishing, and managing hard copy records.

There were funds budgeted for corporate-wide records management training to continue to build on the concepts that had been already introduced to all staff. It was important that staff hear the message from a fresh perspective – an outside expert.

 

Locking Out the Locksmith

For all practical purposes, IT staff cannot be kept out of ‘restricted’ file locations. In essence, one cannot lock out the locksmith. This came as quite a surprise to many managers, and it was a point that needed to be driven home.

RIM staff are responsible for managing all structure and performing purges of files, but are not supposed to be aware of the contents (or, theoretically, the existence) of ‘restricted’ files. IT staff are the administrators of all software, and so have access to virtually everything contained within it.

Although RIM and IT staff at the SCRD are union members, they essentially have access to personnel records. As a result, IT and RIM staff have both an ethical and a legal responsibility to protect personal and privileged information. It is worth investing in tools and architecture to demonstrate that the separation of roles has been maintained:

  • Create distinct ‘Directory Maintenance’ users with full access to all folders, but require RIM staff to log in as a separate Directory Maintenance user to perform structural modifications and purges; and
  • Deploy file access auditing software such as Netwrix or Varonis, to track Create/Read/Update/Delete events on restricted folders by any user, including the system administrators.

These proactive steps will help to answer questions about activity by any user or involving any file, not just demonstrate the integrity of the locksmiths.

 

Successes

Despite the challenges, the restructure project had many successes. Although the SCRD did not maintain statistics on duplication and version forks, they were noticeably reduced. For the most part, there was staff buy-in, and in discussions with staff, they found the new structure to be much more intuitive, efficient and easier to navigate.

Switching from assigning specific permissions to named users to assigning users to roles, and granting permissions to those roles, has dramatically improved the ease of user account maintenance for IT. By changing how permissions were assigned, it:

  • Automatically ensured consistent access to information for users who shared the same job function;
  • Created a clearer explanation for why a user might have privileged access; and
  • Reduced the need for both IT and the user to know in advance exactly what they might ultimately need access to.

Also, by having default-deny restricted folders, managers had to follow proper procedures for managing that access – employees could not be quietly moved to a different job as they likely would not have the access they needed. Once a staff member was moved to a new position, the access associated with their old job fell away. As an example, one of three Infrastructure Services Secretaries could post into the Transit Dispatcher position. Their previous access to records such as in-camera minutes and agendas would fall away, and FOIPP-protected Transit customer information would become accessible, all with two clicks.
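
In practice, a role-based grant is a one-line operation once the role exists as a security group. The minimal sketch below shows the idea using the Windows icacls utility; the folder and the “SCRD\Role-TransitDispatcher” group are hypothetical examples, not the SCRD’s actual names.

    import subprocess

    def grant_role_read(folder, role_group):
        """Grant read access to a role group, inheriting to subfolders and files."""
        # (OI)(CI)R = object- and container-inherit, read-only access.
        subprocess.run(["icacls", folder, "/grant",
                        f"{role_group}:(OI)(CI)R"], check=True)

    grant_role_read(r"N:\Transportation and Transit Service",
                    r"SCRD\Role-TransitDispatcher")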

The individual file count has shrunk from 460,000 to 202,000 files, with 80% of the reduction occurring in files that had not been modified in more than seven years.

By far, the two biggest successes this project has realized are:

  1. The retention schedule could be applied to electronic records; and
  2. There was a uniform, logical, predictable folder structure.

Below is a representation of the folder structure prior to and after the restructure. Note that the “spikiness” has been eliminated and the new structure is consistent and predictable – now no more than four nested folders ever need to be contemplated when navigating, and no records or individual files will be encountered until the third or fourth level.

Next Steps

In July, 2014, the SCRD put out an RFP for an EDRMS. In discussions, the various vendors who bid on the RFP agreed that the work done on the shared drive would ease the transition to an EDRMS, as the structure could be imported directly.

The lessons learned from the restructure project were incorporated into the implementation plan for the EDRMS and the SCRD went live with “Dr. Know” on May 19, 2015.

 

Conclusion

The SCRD’s restructure project was a great success, moving the organization from chaos to order. We will continue to build on this success, evolving our RIM practices, workflows and training as technology changes.

 

De la promesse à l’accomplissement – Les 30 ans d’histoire de la technologie de tenue des enregistrements électroniques

Par Bruce Miller

Voici l’histoire vue de l’intérieur du développement de la technologie de tenue des enregistrements électroniques1, de ses débuts à ses fonctionnalités et à son statut actuels. Le présent document n’est pas qu’une simple leçon d’histoire, mais plutôt l’histoire racontée de manière objective et concrète des succès et des échecs de cette technologie et du chemin parcouru. Je ferai de mon mieux pour prédire la voie que nous devons suivre pour l’avenir, selon la situation actuelle. Merci de garder à l’esprit que j’expose souvent dans ce texte mes opinions, qui ne doivent pas être interprétées comme des faits. Je vous souhaite bonne lecture!
– Bruce Miller

L’histoire du développement de la technologie de tenue des enregistrements électroniques jusqu’à ce jour peut à mon avis être divisée en trois périodes ou stades distincts :

 

Logiciels de tenue des enregistrements électroniques – De 1983 à 2002. Cette période marque la genèse des logiciels de tenue des enregistrements électroniques, les débuts difficiles d’une nouvelle technologie et l’apparition d’une nouvelle norme logicielle américaine qui a transformé à jamais le marché.

Gestion de documents  – De 2002 à 2012. Au cours de cette période, les principales entreprises de logiciels de gestion de documents se sont mises à incorporer des fonctionnalités de tenue des enregistrements, transformant de ce fait non seulement la technologie, mais également la façon même dont les solutions arrivent sur le marché.

Microsoft  – De 2012 à aujourd’hui. En raison de l’omniprésence à l’échelle du globe de Microsoft dans nos systèmes de gestion, son influence est à elle seule, et c’est le moins qu’on puisse dire, démesurée. La plateforme de gestion de contenu d’entreprise (ECM) SharePoint de Microsoft a commencé à gagner de plus en plus de terrain sur les fournisseurs de plateformes de gestion de contenu d’entreprise établis. C’est au cours de cette période que Microsoft a finalement compris la nécessité d’offrir des fonctionnalités de tenue des enregistrements, à l’instar des acteurs bien implantés du domaine des plateformes de gestion de contenu d’entreprise. Microsoft a fait de la fonction de tenue des enregistrements l’un des fondements de son logiciel SharePoint, malgré le fait que les logiciels concurrents étaient déjà dotés de fonctionnalités de tenue des enregistrements bien établies. Pour le meilleur ou pour le pire, cette démarche a entraîné des changements majeurs dans la technologie des logiciels de tenue des enregistrements et a finalement donné lieu à une innovation sans précédent qui a permis de franchir un obstacle majeur qui nous freinait dans les premiers temps.

Les trois stades du marché

Logiciels de tenue des enregistrements électroniques

Le premier stade débute en 1983. À l’époque, quelques organismes gouvernementaux fédéraux du Canada, dirigés par un groupe du nom de Centre canadien de recherche sur l’informatisation du travail (CITI), ont accordé quelques subventions assez conséquentes à des entreprises de technologie prometteuses afin que ces dernières mettent au point de nouvelles technologies d’informatisation du travail. Chaque subvention était accordée par un ministère différent. Les subventions s’inscrivaient en quelque sorte dans le cadre d’un concours et chacune d’elles devait servir à financer une proposition pour un projet de recherche et de développement. Les propositions soumises étaient jugées et un gagnant était choisi. Ce dernier recevait une subvention additionnelle du gouvernement dans l’espoir d’encourager la mise en marché de la technologie proposée. À cette époque, j’avais terminé mes études depuis un an et j’étais à l’emploi d’une entreprise montréalaise qui travaillait à créer le premier ordinateur portatif au monde, le Bytec’s Hyperion, à Ottawa.

L’entreprise pour laquelle je travaillais s’était vue accorder une subvention par les Archives nationales du Canada (ANC) et le ministère des Communications (MDC). Le ministère des Communications voulait stimuler la croissance du secteur des technologies. Les Archives nationales du Canada désiraient quant à elles se doter de nouvelles capacités technologiques pour gérer les enregistrements électroniques. C’est moi qui étais alors chargé de mettre au point une proposition pour une nouvelle technologie. J’ai donc mis sur pied une équipe composée de jeunes spécialistes des logiciels de mon entreprise et d’intervenants clés des Archives nationales du Canada et du ministère des Communications. Les Archives nationales du Canada dirigeaient les efforts pour définir les exigences. Le ministère des Communications agissait à titre d’exemple de consommateur du gouvernement fédéral de la nouvelle technologie et contribuait à définir les exigences. Cette technologie, nous l’avons appelée Foremost (Formal Records Management for Office Systems Technology). Notre proposition a remporté le concours; toutefois, d’ici à ce qu’un projet officiel puisse être mis sur pied, l’entreprise pour laquelle je travaillais était en train de s’écrouler. L’avenir paraissait bien sombre, si ce n’est que nous détenions une formidable idée de nouveau logiciel.

Mon équipe et moi avons alors décidé de former une nouvelle entreprise pour mettre au point la technologie que nous avions proposée. Nous avons tous réuni un peu d’argent, et c’est ainsi que notre entreprise est née. Nous avons obtenu du financement de capital-risque pour assurer la continuité de nos activités afin que nous puissions créer ce nouveau logiciel. Les Archives nationales du Canada et le ministère des Communications gardaient espoir que nous puissions un jour fournir le produit commercial dont ils avaient besoin, et ils ont donc continué à nous donner des conseils et des orientations sur les fonctions que nous devions offrir. Le développement du logiciel s’est poursuivi jusqu’à ce que nous puissions en livrer une première version. Entre temps, une autre entreprise de consultants en logiciels d’Ottawa développant un produit concurrent s’était lancée dans la course. Nous avions également eu vent de l’existence d’une deuxième entreprise concurrente, australienne celle-là.

Nous avons ainsi continué, imperturbables, à faire croître notre entreprise, et les Archives nationales du Canada ont testé et validé notre logiciel. Nous commencions à recevoir beaucoup d’attention aux États-Unis, et le monde entier montrait un intérêt croissant, surtout les organismes gouvernementaux, qui en étaient venus à réaliser qu’un nombre sans cesse grandissant de leurs enregistrements étaient en format électronique et que ces derniers devaient faire l’objet des mêmes contrôles obligatoires en matière de tenue des enregistrements que les autres enregistrements. Le directeur des Archives nationales du Canada, John Macdonald (actuellement retraité), était un promoteur clé du projet, et il a sans aucun doute influencé les services des archives d’autres pays grâce à ses efforts en faveur de ce type de technologie. On dit que c’est grâce à lui que les travaux des Archives nationales du Canada sont devenus un facteur déterminant dans l’élaboration ultérieure de la norme en matière de tenue des enregistrements électroniques du Conseil international des archives (CIA).

De mille et une façons, nous étions la petite entreprise de logiciels en démarrage typique. Tout ce que nous faisions était nouveau, du jamais vu. Il est vrai que nous avions des concurrents, toutefois, nous savions très bien que nous avions une longueur d’avance sur eux en matière d’innovation. Notre produit connaissait un développement constant et de nouvelles versions étaient lancées chaque année. Nous commencions à réaliser quelques ventes auprès de premiers utilisateurs, mais aucune vente assez importante pour assurer la rentabilité de l’entreprise, encore moins sa réussite financière. Nous avions soulevé beaucoup d’intérêt, y compris de l’armée américaine et de sociétés d’envergure, mais le fait que nous étions une petite entreprise en démarrage canadienne a découragé plus d’un acheteur potentiel.

En moins de deux, une décennie s’était écoulée. Nos concurrents devenaient de plus en plus solides et ajoutaient de nouvelles fonctionnalités, tout comme nous d’ailleurs. Notre atout, c’était le financement en capital-risque : nous disposions de suffisamment de capital pour bâtir une équipe de développement solide axée sur la création de nouvelles fonctionnalités innovantes sans dépendre des revenus provenant des ventes.

Puis, il y a eu la guerre du Golfe. Les anciens combattants de retour de mission se plaignaient d’une maladie jamais observée auparavant qui vint à être connue sous le nom de « syndrome de la guerre du Golfe ». Quel est le rapport entre le syndrome de la guerre du Golfe et la tenue des enregistrements électroniques? Il s’est avéré que ces deux éléments sont très liés. Les anciens combattants de la Guerre du Golfe avaient intenté un recours collectif contre l’armée américaine dans le but de recevoir une indemnisation. Bien que je ne me souvienne pas de l’issue du litige, je sais qu’il a coûté cher à l’armée américaine en argent des contribuables. Le Pentagone est par la suite arrivé à la conclusion que l’incapacité de reconnaître la maladie provenait en grande partie de la destruction des dossiers médicaux électroniques des militaires sur les ordinateurs de l’armée. Dans le but de prévenir la perte de dossiers médicaux électroniques à l’avenir, le Pentagone a exigé de toutes les agences militaires américaines qu’elles gèrent de manière adéquate leurs dossiers médicaux électroniques. Afin d’accélérer la transition vers la tenue des enregistrements électroniques, le Pentagone a développé une norme électronique obligatoire, US DoD-5015.02-STD, qui spécifie que tous les logiciels de tenue des enregistrements électroniques utilisés pour le stockage d’enregistrements électronique doivent respecter les exigences énoncées dans cette norme.

Le Pentagone est allé plus loin encore. Ils ont pris la décision audacieuse de mettre sur pied un organisme d’essai pour certifier les produits logiciels des fournisseurs dans le but d’assurer que les exigences très strictes de cette nouvelle norme sont respectées. Ce nouvel organisme d’essais était basé à Fort Huachuca, en Arizona, sous l’égide du Joint Interoperability Test Command (JITC).

Au fil du temps, cette norme a complètement changé la donne sur le marché. Désormais, l’armée américaine, qui est sans conteste le plus grand consommateur de technologies de tenue des enregistrements au monde, ne pouvait acheter que des technologies respectant cette nouvelle norme. La National Archives and Records Administration (NARA) des États-Unis a adopté la norme et a recommandé que tous les organismes gouvernementaux des États- Unis s’y conforment, ce qui a déclenché une réaction en chaîne. En peu de temps, un grand nombre d’agences d’État et d’agences locales américaines ont adopté la norme. Beaucoup de sociétés ont même suivi le mouvement. Leurs demandes de propositions (DDP) pour l’achat d’un logiciel de tenue des enregistrements comprenaient la conformité obligatoire à la norme 5015.02. Cette tendance s’étendra par la suite au Canada et en Europe. D’après mes estimations, près de la moitié des demandes de propositions de l’époque exigeaient obligatoirement la conformité à la norme 5015.02.

Seulement voilà, un problème se posait.

Aucun logiciel ne pouvait être testé. En effet, aucun logiciel conforme à cette nouvelle norme n’avait jamais été écrit. Mon équipe et moi avons apporté des contributions substantielles à l’équipe chargée du développement de la norme. Toutefois, le JITC prenait grand soin de demeurer neutre par rapport aux fournisseurs. Je me souviens d’avoir eu maintes discussions au sujet de ce qui faisait partie du domaine du possible par rapport à ce qui faisait partie du domaine du pratique. La norme qui en résulte comportait une quantité impressionnante de capacités inédites. J’étais d’avis que notre nouveau logiciel respectait la majeure partie des exigences de base, mais un travail considérable devait être accompli pour respecter la norme. La course était lancée pour obtenir la certification à cette nouvelle norme. Nous étions loin d’être la seule entreprise dans la course : toutes les entreprises du domaine des logiciels devaient maintenant obtenir cette certification au risque de perdre d’importantes parts de marché.

J’étais conscient que notre entreprise jouait le tout pour le tout – il fallait aller de l’avant. Nous avons donc fortement augmenté notre financement de capital-risque pour doter notre équipe de davantage de compétences et de ressources, et la date du premier test de certification de notre logiciel à Fort Huachuca a été fixée.

Encore une fois, un problème se posait.

Comment le JITC allait-il tester un logiciel qu’il n’avait jamais vu avant? Des procédures de test devaient être écrites et, pour ce faire, le JITC devait connaître ce qu’il testait. Or, il n’avait jamais vu de logiciel de ce genre. Le JITC était novice dans la tenue des enregistrements, sans parler de ce nouveau type de logiciel. Pour résumer en quelques points cette histoire, j’ai visité à de nombreuses reprises les laboratoires de Fort Huachuca, à la frontière mexicaine, sur une période de plusieurs mois. Je m’étais habitué à fréquenter les gens du laboratoire, et nous nous connaissions bien. Fort Huachuca, base militaire active équipée d’un aéroport de drones, était une base militaire américaine très animée tout à fait typique et je me suis bien entendu avec les nombreux autres entrepreneurs et civils qui y travaillaient avec des permis de jour.

Une fois les procédures de test finalisées, le grand jour du test de notre logiciel était enfin arrivé – le tout premier logiciel que l’équipe du JITC allait tester. Ce jour-là, je me suis présenté pour rejoindre l’équipe du labo, comme à mon habitude. La sécurité était d’une rigueur inhabituelle. Depuis ma dernière visite au labo, les États-Unis avaient déclenché la deuxième Guerre du Golfe. Je l’ignorais, mais toutes les bases militaires rehaussent le niveau de sécurité quand le pays est en guerre. J’ai été stoppé à l’entrée par la police militaire et on m’a demandé de montrer ma carte d’identité. Avant cette journée, mon passeport canadien était la seule pièce d’identité que je devais présenter, mais les choses avaient changé. Le jour du test, on m’a escorté hors du désert de l’Arizona, mon passeport canadien à la main. Mon ordinateur portatif contenant toutes les données et les procédures de test a été confisqué, et les données effacées pour toujours. Je ne serai jamais vraiment en mesure de raconter précisément comment nous nous sommes remis de cette mésaventure, mais je me contenterai de dire que l’histoire implique des gens sympathiques de l’armée américaine, le désert brûlant et beaucoup de Poulet frit Kentucky et de Coca-Cola sur une période de plusieurs jours.

C’est ainsi que le tout premier logiciel certifié 5015.02 est apparu. Les entreprises de logiciels faisaient la queue pour pouvoir fixer la date de leur test de certification auprès du JITC et la liste d’attente était de deux ans. Le monde des logiciels de tenue des enregistrements électroniques n’a plus jamais été le même. La certification 5015.02 était devenue l’objectif principal de tout logiciel. Cette norme obligeait tous les fournisseurs à doter leur logiciel de fonctionnalités bien précises, au risque de perdre des parts de marché. Elle a au bout du compte façonné l’architecture et l’ensemble des fonctions globales de tous les logiciels certifiés. Notre concurrent local d’Ottawa a par la suite obtenu la certification, de même que notre concurrent australien.

Une fois que la norme américaine a été bien établie, l’Australie a suivi avec sa propre norme, Victorian Electronic Records Strategy (VERS). Ce n’est que plusieurs années plus tard que les Australiens ont développé un régime de tests de logiciels à l’appui de cette norme. D’autres normes ont commencé à apparaître :

  • Module 2 de l’ICA, Principes et exigences fonctionnelles pour les enregistrements dans les environnements électroniques (Principles and Functional Requirements for Records in Electronic Environments). Cette norme est par la suite devenue la norme ISO 16175-2:2011;
  • MoReq (MoReq1, MoReq2). Prévue à l’origine comme norme européenne.

Il m’a semblé que toutes ces normes cherchaient d’une façon ou d’une autre à dépasser ou à tout le moins à compléter la norme américaine. À l’époque, je considérais que la norme US DoD 5015.02 était absolument essentielle pour réaliser des ventes sur le marché américain. La norme MoReq semblait à mon avis demander la lune du logiciel, comme la capacité de « préserver la lisibilité des documents électroniques pour toujours ». Bonne chance! J’ai eu bien du mal à prendre cette norme au sérieux, et je n’ai jamais rencontré un seul acheteur qui exigeait la conformité à cette norme. J’avais beaucoup d’admiration pour la norme VERS étant donné qu’elle est fondée sur la norme 5015.02, mais il faut garder à l’esprit que seuls les acheteurs australiens s’en préoccupaient, tandis que j’étais à mon avis, tout comme la plupart de mes concurrents, plus orienté vers le marché américain.

Qu’en est-il du Canada? Le gouvernement fédéral, par l’entremise du Conseil du Trésor du Canada, avait fini par choisir le Module 2 de l’ICA comme norme pour les logiciels du gouvernement du Canada. Cette norme est fort différente de la norme 5015.02, en ce sens qu’elle est rédigée du point de vue de l’archiviste, et elle est donc surtout axée sur les capacités à l’appui de la préservation des enregistrements. La norme de l’ICA n’était tout simplement pas à ce point centrée sur le cycle de vie actif des enregistrements, et la majeure partie des spécifications se situaient en aval du cycle de vie actif. En outre, la contribution à cette norme provenait en majeure partie des pays du Commonwealth, et c’était très manifeste. Cette norme comprend le vieux système numérique par blocs ainsi que les niveaux de sécurité (Confidentiel, Secret, Protégé, Très secret…). Le Module 2 de l’ICA exige un nombre époustouflant de 275 capacités, contre 168 capacités pour la norme 5015.02, dont certaines, un peu comme c’est le cas pour la norme MoReq1, représentaient simplement un idéal et étaient impossibles à intégrer avec la technologie de l’époque.

Je ne plaide aucunement en faveur d’une norme par rapport à une autre; toutes les normes ont leur utilité. Toutefois, je m’inquiète quand je vois le mot « devrait » dans une norme de logiciels. Il est impossible de tester une fonction de logiciel par rapport à une exigence contenant le mot « devrait ». À mon sens, une norme de logiciels n’en est véritablement une que lorsqu’il est possible de la tester par rapport à la technologie du monde réel. Toutes les autres normes représentent une sorte d’idéal, et aucun fournisseur de logiciels ne devrait déclarer un logiciel conforme à une norme sur cette base, bien que certains le fassent.

Comment les entreprises de logiciels de tenue des enregistrements électroniques s’en sortaient-elles? Des trois entreprises que j’ai mentionnées jusqu’à maintenant, l’entreprise australienne était celle qui s’en tirait le mieux, étant donné qu’elle semblait vendre un plus grand nombre de logiciels, bien qu’essentiellement dans les pays de la ceinture du Pacifique, et cette entreprise affichait une croissance plus vigoureuse et plus rapide que les autres. Elle s’était concentrée sur le marché local australien et sur le marché européen, et tout allait bien pour elle. Cette entreprise a crû jusqu’à atteindre un effectif d’environ 35 personnes. L’entreprise australienne avait obtenu la certification US DoD 5015.02 sur le tard, mais elle a fini par établir une tête de pont aux États-Unis en embauchant un ancien pilote très respecté de l’armée de l’air américaine, qui était devenu gestionnaire du programme des dossiers militaires, pour la représenter aux États-Unis. L’entreprise australienne offrait surtout des solutions axées sur les documents physiques (chemises de classement papier, boîtes). Cette entreprise m’a toujours semblé reléguer la gestion des enregistrements électroniques au second plan. Notre concurrent local d’Ottawa vendait lui aussi surtout des solutions de tenue des enregistrements en format papier, la tenue des enregistrements électroniques étant accessoire. Cette entreprise a obtenu assez rapidement la certification 5015.02 et elle a remporté un appel d’offres du gouvernement canadien dans le cadre duquel le plus bas soumissionnaire l’emporte pour un système qui allait être connu sous le nom de SGDDI (Système de gestion des dossiers, des documents et de l’information)2. Malgré notre déception, nous étions convaincus que notre avenir était lié au marché américain et non pas au marché canadien. Contrairement à nos concurrents à l’époque, nos solutions étaient entièrement axées sur les enregistrements électroniques, les documents physiques faisant partie des fonctionnalités secondaires.

Pourquoi donc n’avons-nous pas mis l’accent sur les documents sur papier? Parce que notre mission était de fournir des solutions pour les enregistrements électroniques et non pas pour les documents sur papier. Le monde était inondé de solutions de gestion de documents sur papier satisfaisantes. Nous sommes demeurés entièrement fidèles à notre mission.

La première certification 5015.02 a galvanisé l’industrie des logiciels de tenue des enregistrements électroniques tout entière. Les acheteurs éventuels exigeaient la conformité à cette norme, à défaut de quoi il était impossible de vendre des logiciels dans la majeure partie du marché américain. Et pendant un certain moment, seule une petite entreprise de logiciels canadienne se conformait à cette norme. Soudainement, les grandes entreprises de logiciels de gestion de documents tendaient l’oreille…

 

Gestion de documents

Au même moment où le marché des logiciels de tenue des enregistrements électroniques se développait de 1983 à 2002, un marché parallèle émergeait : la gestion de documents. Les produits de gestion des documents offraient un répertoire organisé pour tous les documents électroniques créés et de puissantes capacités de recherche pour que les utilisateurs puissent trouver les documents dont ils ont besoin. Il était en effet absurde que tous les utilisateurs d’une entreprise stockent leurs documents uniquement dans les ordinateurs individuels. La demande pour la gestion de documents était croissante. Il ne s’agissait pas d’un petit marché comme celui de la tenue des enregistrements électroniques – on parle ici d’un marché absolument gigantesque. À partir de l’année 2002, l’usage de ce type de logiciel était devenu généralisé dans les entreprises à travers le monde. À l’époque, comme c’est le cas aujourd’hui, il semblait que pratiquement toutes les sociétés comptant en leur sein plus de quelques centaines d’utilisateurs avaient besoin d’un système de gestion de documents.

Le secteur de la gestion de documents était en croissance, tant sur le plan de la maturité du marché que des capacités et, à partir de 2002, il s’était transformé en « gestion de contenu », ou, plus précisément, en gestion de contenu d’entreprise (Enterprise Content Management ou ECM). IBM avait sa propre plateforme, Content Manager. OpenText s’était lancée dans ce marché avec sa plateforme de gestion de contenu d’entreprise LiveLink et, grâce à plusieurs acquisitions, dont celle de l’entreprise canadienne Hummingbird, a pu faire progresser sa plateforme de gestion de contenu d’entreprise Content Server. FileNet avait son produit de gestion de contenu d’entreprise très connu, FileNet P8, surtout axé sur la gestion des images. Mais le précurseur était Documentum, une entreprise californienne. Grâce à ses solutions de gestion de contenu d’entreprise à grande échelle pour les sociétés pharmaceutiques d’envergure partout dans le monde, Documentum dominait complètement le secteur pharmaceutique. Si c’était gros, c’était bien souvent Documentum. On ne peut pas dire qu’IBM était à la traîne non plus. L’entreprise avait des installations partout dans le monde, notamment chez un client très connu, la Social Security Administration des États-Unis, qui comptait plus de 200 000 utilisateurs. Hewlett Packard (HP) a par la suite pénétré le marché des plateformes de gestion de contenu d’entreprise grâce à l’acquisition d’Autonomy. Il importe d’insister sur le fait que ces « nouvelles » plateformes de gestion de contenu d’entreprise ne s’inscrivaient en fait que dans l’évolution continue de la gestion de documents, la seule différence étant leurs noms plus accrocheurs et leurs capacités sans cesse grandissantes. Le secteur des logiciels de tenue des enregistrements était quant à lui un secteur de marché spécialisé minuscule exploité par un petit nombre de fournisseurs de logiciels de taille relativement modeste. Ces deux marchés ne se chevauchaient pas encore de manière importante. La gestion de contenu d’entreprise était la gestion de contenu d’entreprise et la gestion des enregistrements était la gestion des enregistrements, tout simplement.

Les fusions étaient incessantes dans le secteur de marché des plateformes de gestion de contenu d’entreprise. Documentum a été acquise par EMC, et EMC a depuis été acquise par OpenText. IBM a quant à elle fait l’acquisition de FileNet.

Ces sociétés n’ont accordé que peu d’attention au marché de la tenue des enregistrements électroniques. Elles n’avaient après tout aucune raison de le faire. Toutefois, avec la venue des premières certifications 5015.02, la conformité à la norme US DoD 5015.02 apparaissait de plus en plus souvent comme exigence obligatoire dans les demandes de propositions auxquelles elles répondaient. Plusieurs acheteurs et clients potentiels du gouvernement américain leur répondaient : « Nous ne pouvons plus acheter vos produits à moins qu’ils ne soient conformes à la norme 5015.02. » Au Canada et sans doute dans d’autres pays, la conformité à la norme 5015.02 faisait partie des exigences de bon nombre de demandes de propositions relatives à la technologie de gestion de contenu d’entreprise. Il y avait toutefois un problème…

Les fournisseurs de gestion de contenu d’entreprise ne connaissaient pas du tout la norme 5015.02, et ils ont donc été pris complètement par surprise. Deux choix s’offraient à eux : développer ou acheter. Ils pouvaient soit concevoir et développer ces nouvelles fonctions singulières pour leurs produits et, à terme, obtenir la certification, soit acheter des technologies existantes et les incorporer à leurs propres produits. Rappelons qu’à ce moment, nous avions déjà passé plus d’une décennie à concevoir, à développer et à perfectionner cette technologie, tout en étant centrés exclusivement sur ce créneau spécialisé. Pour obtenir la certification à la norme 5015.02, un savoir considérable, une grande équipe de développeurs hautement qualifiés et beaucoup de temps, d’efforts et d’argent ont été nécessaires. Tout ce travail ne pouvait s’accomplir facilement ou rapidement, même pour la toute-puissante IBM.

Les fournisseurs de plateformes de gestion de contenu d’entreprise, IBM, FileNet, Documentum et OpenText, avaient tous besoin de cette certification – et le plus tôt possible, l’absence de certification ayant une incidence négative sur leurs quotas de vente. Il existait trois cibles potentielles d’acquisition : mon entreprise à Ottawa, notre concurrent local d’Ottawa et notre concurrent australien. Quatre entreprises avaient besoin de la certification et il n’existait que trois petites entreprises spécialisées qui détenaient la technologie convoitée.

Voici donc ce qui s’est passé. J’ai quitté ma propre entreprise et j’en ai fondé une autre pour développer la technologie de deuxième génération que nous estimions essentielle pour réussir. Mon ancienne entreprise a été acquise par Documentum. Ma nouvelle entreprise a quant à elle été acquise par IBM. Mon concurrent local d’Ottawa a été acquis par OpenText. Pour ce qui est de l’entreprise australienne, elle a décidé de faire cavalier seul. FileNet, laissée pour compte dans ce jeu des acquisitions, n’avait pas d’autre choix que de se doter elle-même de fonctionnalités de tenue des enregistrements, et elle s’est attelée tout de suite à la tâche. Plusieurs années plus tard, notre concurrent australien a finalement été acquis par Hewlett-Packard.

À l’exception notable de l’entreprise australienne dont j’ai déjà parlé, les entreprises de logiciels de tenue des enregistrements électroniques naissantes avaient pratiquement disparu du jour au lendemain, englouties dans la course que se livraient les fournisseurs de plateformes de gestion de contenu d’entreprise pour être conformes à la norme DOD 5015.02 avant leurs concurrents. Les quelques petits fournisseurs spécialisés avaient donc été avalés – il ne restait plus que l’entreprise australienne. C’était maintenant les fournisseurs de plateformes de gestion de contenu d’entreprise qui menaient la barque. Encore aujourd’hui, je suis d’avis que c’est une bonne chose pour le marché.

J’en suis venu à constater que les sociétés devant se conformer à des exigences en matière de tenue des enregistrements n’avaient pas d’autre choix que d’acheter les logiciels répondant à leurs besoins (habituellement, la conformité à la norme 5015.02), ce qui ne signifie pas nécessairement que ces dernières utilisaient les fonctions qu’elles avaient acquises. La tenue des enregistrements n’était désormais qu’une fonctionnalité parmi d’autres de la technologie de gestion de contenu d’entreprise moderne. Ceux qui désiraient se doter de fonctionnalités de tenue des enregistrements électroniques devaient d’abord se procurer une plateforme de gestion de contenu d’entreprise. La tenue des enregistrements était devenue un passage obligé dans la plupart des demandes de propositions pour l’acquisition d’une plateforme de gestion de contenu d’entreprise. Les ventes de logiciels de gestion de contenu d’entreprise continuaient de croître, et cela faisait l’affaire de tout le monde.

Je ne peux pas me prononcer au sujet des autres acquisitions, mais IBM a accompli un travail remarquable pour absorber mon entreprise. L’intégration, qui était de la plus haute importance pour IBM, a été remarquablement bien effectuée. Or, j’étais tout sauf content de la situation. En effet, après quelques années, j’avais remarqué que malgré toutes les ventes et les livraisons réalisées de solutions de tenue des enregistrements électroniques, peu d’entreprises acquéreuses avaient vraiment procédé au déploiement de ces solutions. IBM n’était pas la seule concernée; pratiquement tout le marché l’était. Les logiciels de gestion de contenu d’entreprise étaient déployés avec des fonctionnalités de tenue des enregistrements, mais absolument rien n’indiquait que ces nouvelles fonctionnalités des plateformes de gestion de contenu d’entreprise servaient véritablement à gérer des enregistrements électroniques.

J’ai vu beaucoup d’entreprises qui prétendaient gérer leurs enregistrements électroniques. En analysant plus en profondeur leurs projets, plusieurs points agaçants ressortaient, et ce, de manière récurrente. Beaucoup (beaucoup trop) d’entreprises ne savaient absolument pas comment déployer les fonctionnalités de tenue des enregistrements électroniques. Certaines entreprises avaient inclus la tenue des enregistrements dans leur commande de plateforme de gestion de contenu d’entreprise parce qu’elles étaient obligées de le faire, mais le déploiement n’avait jamais été effectué (phénomène du « shelfware »). Beaucoup ont tenté de gérer les enregistrements électroniques, ont échoué et sont retournées à la gestion de documents sur papier seulement, laissant les enregistrements électroniques non gérés. J’ai vu une grande entreprise américaine très connue s’y prendre à trois reprises, dépenser des millions de dollars pour déployer la technologie de tenue des enregistrements avec deux fournisseurs différents de logiciels de gestion de contenu d’entreprise, puis finir par baisser les bras.

À mon avis, le succès d’un déploiement de la tenue des enregistrements se mesure à deux critères simples. Premièrement, les documents électroniques doivent être gérés adéquatement à titre d’enregistrements sur une base régulière et quotidienne par tous les utilisateurs. Deuxièmement, le gestionnaire des enregistrements de l’entreprise doit procéder à la disposition des enregistrements électroniques conformément au calendrier de conservation des enregistrements électroniques. J’ai entendu maintes et maintes fois des histoires de gestion supposément réussie des enregistrements électroniques. Or, chaque fois que j’ai analysé plus en profondeur ces projets à la recherche de mes deux critères de réussite, je ne les ai jamais trouvés.
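À titre d’illustration seulement, voici comment ces deux critères pourraient se traduire en mesures simples. L’esquisse qui suit (en Python) repose sur des champs hypothétiques que j’invente pour l’exemple; elle ne correspond à l’interface d’aucun produit réel.

    from dataclasses import dataclass

    @dataclass
    class Document:
        # Champs hypothétiques, à adapter au système réellement déployé.
        declare_comme_enregistrement: bool  # géré comme enregistrement au quotidien
        echu: bool                          # durée de conservation écoulée
        dispose: bool                       # sort final appliqué par le gestionnaire

    def taux_de_declaration(docs):
        """Critère 1 : part des documents gérés comme enregistrements."""
        return sum(d.declare_comme_enregistrement for d in docs) / max(len(docs), 1)

    def taux_de_disposition(docs):
        """Critère 2 : part des enregistrements échus ayant reçu leur sort final."""
        echus = [d for d in docs if d.echu]
        return sum(d.dispose for d in echus) / max(len(echus), 1)

Un déploiement que je considérerais comme réussi afficherait des taux élevés sur ces deux mesures, et non seulement sur la première.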

Les logiciels de gestion de contenu d’entreprise respectaient de toute évidence les exigences en matière de tenue des enregistrements, du moins celles énoncées dans la norme 5015.02. Bon nombre de fournisseurs de logiciels de gestion de contenu d’entreprise sont allés bien au-delà de la norme 5015.02 et offraient des fonctions comme la recherche de contenu, l’investigation informatique et la gestion de dossiers physiques. La plupart offraient même la gestion de courriels sous une forme ou sous une autre, un élément crucial de la tenue des enregistrements électroniques étant donné qu’une grande partie des enregistrements électroniques sont des courriels. Les plateformes de gestion de contenu d’entreprise offraient de puissantes fonctionnalités. Des entreprises d’envergure mondiale comptant en leur sein des gens hautement compétents fournissaient des solutions de gestion de contenu d’entreprise efficaces.

Nous sommes donc en droit de nous demander pour quelle raison le déploiement de la composante « tenue des enregistrements » de ces solutions de gestion de contenu d’entreprise s’est révélé être un échec alors même que le projet de gestion de contenu d’entreprise sous-jacent était un franc succès. En réalité, il existait un certain nombre de raisons claires et évidentes. Au fil des ans, j’ai commencé à comprendre les causes profondes (les facteurs liés aux projets) qui, pour plusieurs projets, différentes technologies de gestion de contenu d’entreprise et plusieurs types d’entreprises, empêchaient la tenue des enregistrements électroniques de prendre racine et d’être couronnée de succès. Parfois, un seul facteur pouvait être fatal à la tenue des enregistrements et parfois, il s’agissait d’une combinaison de plusieurs facteurs. Dans tous les cas, l’issue était malheureuse : la tenue des enregistrements électroniques ne se faisait pas. Voici donc les quatre facteurs liés aux projets que j’ai observés et que j’observe encore aujourd’hui.

 

La faible priorité accordée à la tenue des enregistrements. Le passage vers une plateforme de gestion de contenu d’entreprise à l’échelle d’une société est un projet d’envergure qui est fort coûteux et périlleux, et ce, pour toute entreprise. Ce passage constitue un virage important dans l’environnement informatique (TI) d’une entreprise et est extrêmement déstabilisant pour les utilisateurs. Les projets de gestion de contenu d’entreprise sont très coûteux en plus de mettre en jeu la réputation. Leur succès est de première importance, et il y a peu de tolérance pour l’échec. Dans certains cas, les employés des TI sentent que leur réputation et même leur emploi sont en jeu en cas d’échec. Cette obligation de réussir ne s’applique pas à la composante « tenue des enregistrements » du projet, qui n’est qu’accessoire. Il ne viendrait à personne l’idée d’annuler un projet de gestion de contenu d’entreprise parce que le déploiement de la composante « tenue des enregistrements » n’est pas satisfaisant. Personne ne risque d’y perdre sa réputation ou son emploi. En cas d’échec de la composante « tenue des enregistrements » d’un projet de gestion de contenu d’entreprise, le projet se poursuit, sans la tenue des enregistrements. À l’échelle du projet de gestion de contenu d’entreprise, l’importance relative de la tenue des enregistrements est faible. Beaucoup d’entreprises ne peuvent se permettre que le projet de gestion de contenu d’entreprise échoue, ce qui rend en pratique la composante « tenue des enregistrements » facultative. D’après mon expérience, et c’est encore le cas aujourd’hui, la tenue des enregistrements se solde par un échec dans un trop grand nombre de déploiements de plateformes de gestion de contenu d’entreprise : les professionnels de la gestion des documents et de l’information se contentent alors de gérer des documents sur papier, et les fonctionnalités avancées de tenue des enregistrements électroniques demeurent inutilisées.

Les fournisseurs de plateformes de gestion de contenu d’entreprise savent très bien qu’ils peuvent vendre leurs produits pourvu qu’ils respectent les exigences. Si le déploiement de la tenue des enregistrements tourne mal, ils ont peu de chance d’avoir à subir la colère d’un acheteur qui, dans de trop nombreux cas, n’a pas une assez bonne compréhension de cette technologie pour savoir comment la gérer.

À mon avis, cette situation est la conséquence non voulue qui découle du fait que la tenue des enregistrements n’est qu’une fonction de la plateforme de gestion du contenu d’entreprise. Au début, tout le monde s’entendait pour dire que la tenue des enregistrements était une fonction absolument essentielle.

Puis est venue la gestion de contenu d’entreprise, suivie de l’échec de la tenue des enregistrements. Alors, doit-on laisser tomber les projets de gestion de contenu d’entreprise et tout recommencer? Bien sûr que non! La gestion de contenu d’entreprise est un choix sûr, toujours sûr, contrairement à la tenue des enregistrements.

 

Les professionnels de la gestion des documents et de l’information sont mal outillés. Certains professionnels de la gestion des documents et de l’information interprètent mal mon avis sur la question et pensent que je les crois incapables de gérer cette technologie, mais rien n’est moins vrai. De nos jours, les professionnels de la gestion des documents et de l’information sont plus versés en technologie, mieux éduqués et mieux équipés que jamais pour gérer ce type de technologie. Beaucoup d’entre nous demeurent malgré tout vulnérables face aux progrès de la technologie de gestion de contenu d’entreprise. Les gestionnaires des enregistrements en entreprise doivent avoir une compréhension intime des nouvelles capacités de tenue des enregistrements du nouveau logiciel de gestion de contenu d’entreprise. C’est beaucoup à apprendre et à maîtriser, habituellement dans un laps de temps très court. Ils doivent être entièrement à l’aise avec la technologie de gestion de contenu d’entreprise et se poser les questions suivantes : Comment ça marche? Comment les métadonnées sont-elles définies? Comment puis-je contrôler les valeurs des métadonnées? Comment dois-je gérer les documents en tant qu’enregistrements? Que dois-je exiger des utilisateurs? Ces professionnels doivent en outre intégrer le calendrier de conservation dans le logiciel de gestion de contenu d’entreprise – autant dire mission impossible : la plupart des calendriers de conservation ne sont pas structurés adéquatement pour les systèmes de gestion de contenu d’entreprise modernes, et une manipulation des données s’impose à tout le moins, voire une restructuration complète dans certains cas.

Les professionnels de la gestion des documents et de l’information doivent d’abord maîtriser la technologie de gestion de contenu d’entreprise, ensuite maîtriser les capacités de tenue des enregistrements électroniques, puis faire la refonte du calendrier de conservation et l’intégrer et finalement influencer fortement la manière dont la plateforme de gestion de contenu d’entreprise est configurée et déployée afin que la tenue des enregistrements fonctionne correctement. Et tout ce travail doit habituellement être accompli dans une semaine de 40 heures de travail bien remplie en sus des tâches liées aux responsabilités quotidiennes de tenue des enregistrements. C’est donc tout un défi, même pour les meilleurs d’entre nous, et c’est le moins qu’on puisse dire.

Par ailleurs, je suis d’avis que les fournisseurs de gestion de contenu d’entreprise investissent peu de travail et d’argent dans la composante « tenue des enregistrements » par rapport à la plateforme prise dans son ensemble. Je connais même un fournisseur de gestion de contenu d’entreprise qui est passé à deux doigts d’abandonner complètement la tenue des enregistrements parce que cette fonction n’était à son avis « pas assez importante ». D’après moi, la documentation, la formation, les services professionnels et le service après-vente liés aux fonctions de tenue des enregistrements sont souvent à la traîne par rapport aux autres fonctions de la plateforme de gestion de contenu d’entreprise. Je n’ai aucun doute que ce phénomène traduit simplement la priorité relative accordée aux fonctions de tenue des enregistrements par rapport à la plateforme elle-même, mais il n’en demeure pas moins que, pour la plupart des fournisseurs, il y a encore beaucoup de place à l’amélioration dans ce domaine. Au bout du compte, ce facteur constitue encore un autre obstacle que doivent franchir les professionnels de la gestion des documents et de l’information pour maîtriser cette technologie.

La faute n’incombe nullement aux professionnels de la gestion des documents et de l’information, mais plutôt à l’ensemble du secteur, y compris aux fournisseurs et aux acheteurs. Le fossé éducatif se creuse au lieu de se combler.

 

Échec du déploiement de la plateforme de gestion de contenu d’entreprise. Les déploiements de plateformes de gestion de contenu d’entreprise ne sont pas tous réussis. Je n’ai jamais vu deux évaluations du succès d’un déploiement d’un système de gestion de contenu d’entreprise dans lesquelles le succès est défini de la même manière, encore moins une cohérence des résultats ou des mesures. Il est par conséquent impossible de connaître la proportion exacte de déploiements réussis de plateformes de gestion de contenu d’entreprise. Ce nombre, quel qu’il soit, n’est certainement pas près de 100 %. En d’autres mots, la plateforme de gestion de contenu d’entreprise est le navire dans lequel la tenue des enregistrements électroniques est transportée. Si ce navire est en mauvais état et qu’il est rejeté par les utilisateurs finaux, la tenue des enregistrements ne peut rien faire pour inverser la situation. Les fonctionnalités de tenue des enregistrements électroniques de toute solution de gestion de contenu d’entreprise exigent une adoption généralisée par tous les utilisateurs finaux, et cette adoption ne doit pas seulement porter sur les enregistrements électroniques en tant que tels (dont les courriels), mais également sur les métadonnées appropriées sur lesquelles repose la tenue des enregistrements. Sans ces métadonnées, il est pratiquement impossible de gérer les documents en tant qu’enregistrements : appliquer les contrôles appropriés, les classer selon le calendrier de conservation et en disposer à la fin de leur durée de vie utile.

 

Le fossé qui sépare les professionnels de la gestion des documents et de l’information et les professionnels des TI. Parce que la tenue des enregistrements est une fonction de l’écosystème plus large de la plateforme de gestion de contenu d’entreprise, elle ne peut fonctionner en dehors de cette dernière. Les enregistrements sont eux-mêmes des documents stockés dans la plateforme de gestion de contenu d’entreprise et sont complètement sous son contrôle. La plateforme doit être configurée pour déterminer les métadonnées à appliquer aux documents. La plateforme de gestion de contenu d’entreprise détermine et applique les autorisations relatives à la sécurité. Elle spécifie où les documents sont stockés en définissant des emplacements de fichiers comme les dossiers, les bibliothèques ou n’importe quelle autre nomenclature que la plateforme de gestion de contenu d’entreprise particulière applique à ces emplacements.

Par conséquent, pour que les fonctions de tenue des enregistrements des plateformes de gestion de contenu d’entreprise fonctionnent, les professionnels de la gestion des documents et de l’information doivent collaborer de très près avec l’équipe des TI pour configurer la plateforme. Toutefois, comme c’est souvent le cas, l’équipe des TI n’est pas à l’aise avec l’idée que les professionnels de la gestion des documents et de l’information lui dictent comment configurer la plateforme de gestion de contenu d’entreprise. Pire encore, peu de professionnels de la gestion des documents et de l’information sont à l’aise avec la configuration de systèmes de gestion de contenu d’entreprise complexes. Ils doivent apprendre de A à Z une technologie et des compétences entièrement nouvelles. Les rares professionnels de la gestion des documents et de l’information qui s’y sont attaqués avec ardeur se sont souvent heurtés à une fin de non-recevoir de la part des grands services des TI, qui allaient de l’avant avec la configuration de la plateforme selon leur propre plan, sans tenir compte des professionnels de la gestion des documents et de l’information. Dans de trop nombreux cas, les TI ont procédé à la gestion, à la planification, à la configuration et au déploiement de la plateforme de gestion de contenu d’entreprise selon leur propre vision et leur propre approche, sans égard ou presque aux fonctionnalités de tenue des enregistrements. Le mandat, le poids politique ou le savoir-faire faisaient souvent défaut aux professionnels de la gestion des documents et de l’information pour exercer sur le déploiement de la plateforme une influence suffisante pour assurer la réussite de la fonction de tenue des enregistrements électroniques. D’après mon expérience, plus une organisation est grande, plus ce fossé a des chances d’être large. Les TI procèdent au déploiement de la plateforme de gestion de contenu d’entreprise et les professionnels de la gestion des documents et de l’information gèrent les documents sur papier, essayant en vain d’influencer l’orientation de la gestion de contenu d’entreprise.

À partir de 2012, bon nombre de fournisseurs de plateformes de gestion de contenu d’entreprise commençaient à éprouver un certain malaise avec la composante « tenue des enregistrements » de leurs projets. Ils percevaient la tenue des enregistrements comme compliquée – beaucoup de soucis en somme. Ils se disaient qu’il existait sûrement une solution plus simple qui ferait l’affaire.

À partir de 2012, la plupart des produits de gestion de contenu d’entreprise venaient avec la capacité d’appliquer des « politiques » relatives aux documents. Une politique s’entend d’un ensemble de comportements et de caractéristiques qui peuvent être appliqués automatiquement à un document. Une politique peut être appliquée à un emplacement (par exemple à un dossier) de manière à ce que tous les documents dans cet emplacement y soient automatiquement soumis. Ces politiques précisent la durée de rétention des documents, le moment de leur destruction et tous les autres critères qui doivent être respectés avant l’effacement du document. Parfois, ces politiques respectaient le calendrier de conservation, mais rien ne l’exigeait : il était tout à fait possible de créer n’importe quelle politique sans tenir compte du calendrier de conservation.
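Pour fixer les idées, voici une esquisse minimale (en Python) de ce mécanisme de politiques. Les noms de champs et l’emplacement « /finances/factures » sont inventés pour l’exemple et ne correspondent à aucun produit en particulier.

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class Politique:
        # Politique hypothétique : durée de rétention et critère supplémentaire
        # à respecter avant l'effacement, comme décrit ci-dessus.
        duree_retention: timedelta
        approbation_requise: bool

    # La politique est rattachée à un emplacement (ici, un dossier);
    # tout document déposé dans cet emplacement y est automatiquement soumis.
    politiques = {"/finances/factures": Politique(timedelta(days=7 * 365), True)}

    def peut_etre_efface(dossier, date_depot, approuve, aujourdhui):
        """Vrai si la rétention est échue et que les critères sont respectés."""
        p = politiques[dossier]
        echu = aujourdhui >= date_depot + p.duree_retention
        return echu and (approuve or not p.approbation_requise)

    # Exemple : une facture déposée en 2010 est effaçable une fois approuvée.
    print(peut_etre_efface("/finances/factures", date(2010, 1, 4), True, date(2017, 2, 1)))

On voit bien le risque décrit plus haut : rien, dans ce mécanisme, n’oblige la politique à découler du calendrier de conservation.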

Cette nouvelle fonctionnalité est devenue assez populaire au sein des produits de gestion de contenu d’entreprise, parce qu’elle était considérée comme une façon pour les clients d’effacer les documents sans avoir à se préoccuper des difficultés et des frais inhérents à la tenue des enregistrements, qui la plupart du temps n’avançait pas. Plusieurs entreprises ont donc adopté ces politiques simples de suppression de documents en fonction de critères imposés par les TI ou provenant des unités d’exploitation, sans tenir compte du calendrier de conservation. Elles estimaient qu’après tout, c’était mieux que rien. Quant à moi, j’estimais au contraire que c’était une tendance inquiétante, une confirmation de plus que le secteur dans l’ensemble ne faisait aucun progrès en matière de déploiement de la tenue des enregistrements électroniques.

Encore aujourd’hui, bon nombre de fournisseurs de systèmes de gestion de contenu d’entreprise offrent des capacités de tenue des enregistrements formelles, en plus d’offrir ces « capacités de politiques » génériques. J’ai entendu à plusieurs reprises des fournisseurs qualifier ces politiques de « version light » de la tenue des enregistrements, ce qui me révolte. Étant donné que j’ai consacré toute ma carrière au développement de la technologie de tenue des enregistrements électroniques, j’ai été profondément troublé de constater qu’aucun déploiement de tenue des enregistrements électroniques ou presque n’avait obtenu de réel succès ni ne respectait mes critères simples. J’ai donc quitté IBM pour fonder un cabinet-conseil indépendant des fournisseurs dont l’unique mission est d’aider les acheteurs à réussir le déploiement de leurs projets de tenue des enregistrements électroniques.

C’est dans ce contexte que Microsoft est entrée dans l’arène de la gestion de contenu d’entreprise en 2001, avec son logiciel SharePoint, que peu de fournisseurs établis de plateformes de gestion de contenu d’entreprise ont pris au sérieux. Quiconque dans le secteur du logiciel sait très bien toutefois qu’il ne faut pas prendre Microsoft à la légère (voir le cas de Michael Cowpland). En 2012, Microsoft avait lentement mais sûrement accaparé une part de plus en plus grande du marché des fournisseurs de gestion de contenu d’entreprise, en partant de la base. Les fournisseurs établis comme EMC, IBM, et OpenText devaient maintenant faire attention : SharePoint était en croissance et était en train de devenir un adversaire à prendre au sérieux dans le marché. À l’instar de tous ses concurrents du marché de la gestion de contenu d’entreprise, Microsoft en est rapidement venue à s’attaquer à la « gestion des enregistrements ».

 

Microsoft

La lente et inexorable pénétration sur le marché de SharePoint de Microsoft a atteint un point tel que ce logiciel est devenu une force avec laquelle il faut compter. Voici quelques chiffres que j’ai glanés ici et là sur le Web :

  • SharePoint représente pour Microsoft un marché de deux milliards de dollars;
  • Selon Microsoft, il y a chaque jour 20 000 nouveaux utilisateurs de SharePoint;
  • 80 % des entreprises du classement Fortune 500 utilisent SharePoint;
  • Selon Microsoft, le taux de pénétration du marché des entreprises est de 66 %.

Les chiffres présentés ci-dessus ne sont pas à jour (je n’ai pas réussi à trouver des chiffres plus récents), mais il ne fait aucun doute que SharePoint a une forte présence au sein de l’entreprise moderne. Par conséquent, de plus en plus de sites SharePoint ont recours, comme vous l’auriez deviné, au stockage de documents qui doivent être gérés comme des enregistrements.

J’ai établi la fin du stade précédent à l’année 2012, et ce pour une bonne raison dont je traiterai plus loin. Pour le moment, j’aimerais retourner en arrière, en 2006, lorsque l’aventure de Microsoft dans la tenue des enregistrements a commencé.

En 2006, Microsoft a commencé à déployer SharePoint au sein de sociétés et d’agences gouvernementales américaines de plus en plus grandes. À cette époque, cette plateforme de gestion de contenu d’entreprise était devenue un acteur sérieux et ravissait des parts de marché à IBM, à FileNet, à OpenText et aux autres en commençant par le bas de l’échelle du marché. Certains des clients les plus précieux de Microsoft lui ont lancé un ultimatum : développez un produit conforme aux normes de tenue des enregistrements ou nous laisserons tomber SharePoint au profit d’un concurrent.

Lors de la sortie de l’édition 2007 de SharePoint, Microsoft a annoncé en grande pompe l’ajout de fonctionnalités de tenue des enregistrements. Du jour au lendemain, le Web était inondé d’articles de Microsoft et d’experts de SharePoint de tous genres qui expliquaient au monde entier comment gérer les enregistrements avec SharePoint 2007. Il était désormais possible de créer des politiques d’effacement automatique des documents conformément à un calendrier de conservation. J’ai donc entrepris d’analyser en profondeur cette nouvelle fonctionnalité et de l’évaluer, pour mon compte et pour celui de mes clients. J’ai été accablé par ce que j’ai constaté.

J’ai lu tout ce que j’ai pu glaner sur le Web et je me suis même rendu au siège social de Microsoft à Redmond, dans l’État de Washington, pour discuter de ces fonctionnalités. J’ai rencontré pratiquement tous les membres de l’équipe de cinq personnes affectée à la gestion des enregistrements de SharePoint (dont un Montréalais, grand fan des Canadiens de Montréal). C’étaient des gens fantastiques, mais je peux vous dire qu’ils n’avaient absolument aucune expérience dans la gestion des enregistrements. J’ai même eu l’insigne honneur de m’entretenir avec l’architecte en chef de SharePoint de Microsoft.

À mon avis? Ils ont manqué leur coup. Il n’y avait tout simplement aucune manière dans SharePoint de gérer un calendrier de conservation adéquat ou d’y importer des données. La gestion adéquate des fiches-mères était impossible et le processus de disposition, inexistant. SharePoint se débarrassait tout simplement des documents à mesure que l’échéance arrivait.

Microsoft a déployé l’artillerie lourde sur le front de la tenue des enregistrements. Ses très nombreux partenaires et experts de SharePoint reprenaient et republiaient le message de base de Microsoft sur la tenue des enregistrements. Encore aujourd’hui, j’observe qu’il existe des milliers de pages Web qui expliquent en détail comment gérer des enregistrements avec SharePoint. J’ai expliqué à Microsoft en quoi elle s’était trompée. En guise de réponse, on m’a poliment expliqué pourquoi c’était de toute évidence moi qui avais tort. Microsoft a envoyé ses gestionnaires des enregistrements en campagne pour raconter la manière dont Microsoft gère ses propres enregistrements avec SharePoint et comment le monde entier pouvait maintenant faire de même.

Ce n’était peut-être qu’une impression, mais je me sentais comme un paria. Je prêchais dans le désert et je contredisais directement la toute-puissante Microsoft. Comme j’ai pu le constater, Microsoft disposait d’alliés très nombreux, soit son réseau mondial de partenaires et de distributeurs, qui s’enrichissaient tous grâce à la vente de services liés à SharePoint. Et naturellement, quand je soutenais auprès des entreprises que SharePoint ne gérait pas les enregistrements, les partenaires et les distributeurs leur disaient tout le contraire, avec l’appui systématique de Microsoft. Qui m’appuyait dans mes démarches? Personne, naturellement. Les professionnels de la gestion des documents et de l’information du monde entier doutaient fortement des affirmations de Microsoft, mais personne ne semblait vraiment en mesure d’expliquer ce qui clochait avec SharePoint.

En 2009, Microsoft a affirmé qu’elle avait pris connaissance des problèmes liés à SharePoint, et elle a annoncé, encore une fois en grande pompe, qu’elle avait apporté une foule d’améliorations aux fonctions de tenue des enregistrements : Enregistrements sur place! Nouvelles fonctions d’organisation de contenu! Nouveau centre de gestion des enregistrements! Politiques de conservation par étapes!

Cette annonce abordait la plupart des principaux points. Un certain scepticisme régnait désormais sur le marché au sujet de la capacité de Microsoft à gérer les enregistrements. Et encore une fois, je suis passé dans les bureaux de Microsoft pour découvrir par moi-même ce qui avait changé. Ma conclusion : pas assez, du moins rien qui permette de surmonter les trois défauts fatals d’origine. J’étais fatigué de devoir répondre sans cesse aux mêmes questions : qu’est-ce qui cloche donc avec SharePoint? Peu de gens me prenaient au sérieux.

Microsoft continuait à offrir la fonctionnalité de tenue des enregistrements dans SharePoint, et disons simplement que je n’étais pas particulièrement fan du réseau mondial de partenaires de Microsoft.

J’en avais assez et je me devais de faire quelque chose. Ma réputation et ma crédibilité commençaient à en pâtir. J’ai donc entrepris de rédiger un rapport détaillé dans lequel j’exposais les faits concrets. J’étais bien conscient qu’on mettrait en doute les moindres détails. J’ai donc soigneusement fait mes recherches et validé l’information avec Microsoft. Le rapport énonçait clairement les défauts de SharePoint et expliquait en détail la manière de le configurer pour faire les choses correctement. J’ai demandé à Microsoft de passer mon rapport en revue et d’en vérifier l’exactitude, ce qu’elle a fait. Microsoft m’a autorisé à utiliser une déclaration stipulant qu’elle avait passé en revue le rapport pour en vérifier l’exactitude factuelle. J’ai distribué mon rapport aux personnes intéressées. ARMA International (https://members.arma.org/eweb/home.aspx?site=ARMASTORE) l’a publié sous forme de livre. En un rien de temps, le rapport circulait dans le monde entier. Que s’est-il passé à la suite de la publication? Rien, ce qui était à mon avis la meilleure issue possible. Le rapport n’a jamais été remis en question. Avec les années, et pas seulement grâce à mon livre, de plus en plus de gens ont réalisé que Microsoft s’était peut-être trompée. Les relations entre Microsoft et moi s’étaient même quelque peu réchauffées : j’ai été invité à me joindre à la tournée pancanadienne de promotion de SharePoint, où j’ai accepté d’expliquer comment gérer les enregistrements avec SharePoint. Cette aventure n’a duré qu’une session : Microsoft m’a viré en raison de mon attitude jugée trop négative à l’égard de la tenue des enregistrements. J’étais de nouveau devenu un paria…

Revenons maintenant à 2012, année où commence le troisième stade. Certains fournisseurs éclairés de logiciels du secteur de la tenue des enregistrements ont vu un créneau de marché émerger en raison des difficultés que connaissait Microsoft dans la tenue des enregistrements. Un tout nouveau segment de marché était né : les logiciels compagnon (add-in) de tenue des enregistrements pour Microsoft SharePoint. En date du présent rapport, il n’existe que quatre fournisseurs qui offrent ce genre de solutions : deux aux États-Unis, un en Australie et un au Canada. Un seul des quatre détient la certification à la norme 5015.02, et deux autres se sont engagés à l’obtenir.

Les acheteurs de SharePoint étaient finalement en mesure d’obtenir une véritable tenue des enregistrements. Les quatre fournisseurs de logiciels compagnon ont chacun adopté une approche radicalement différente pour mettre en œuvre la tenue des enregistrements dans SharePoint, et ils y sont malgré tout parvenus, après un certain nombre de faux départs pour quelques-uns d’entre eux. Qu’arriverait-il si Microsoft réussissait un jour à intégrer la tenue des enregistrements en tant que fonctionnalité native de SharePoint? Cela aurait sans doute pour effet de détruire le marché embryonnaire des logiciels compagnon de tenue des enregistrements. Je doute que cela se produise un jour. Microsoft soutient que tant qu’il existe un marché sain pour les logiciels compagnon de tenue des enregistrements, elle n’a aucun intérêt à plonger à nouveau dans ce marché. L’entreprise préfère plutôt mettre l’accent sur sa plateforme et encourager l’émergence d’un écosystème de produits et de services de tiers qui s’appuient sur la plateforme pour offrir des solutions aux clients. Bref, tant que les gens achètent SharePoint, ça fait l’affaire de Microsoft. Si Microsoft est satisfaite et que les fournisseurs de logiciels compagnon de tenue des enregistrements le sont également, espérons que ce sera aussi le cas des acheteurs de SharePoint! De mon point de vue, jusqu’ici, tout va bien. Tous les deux ans environ, un nouveau fournisseur de logiciels compagnon de tenue des enregistrements arrive sur le marché, et j’attends chaque nouveau venu avec impatience.

Dans ce segment de marché, qui n’est constitué que de quatre fournisseurs relativement petits, rien ne se fait de la même manière que dans le reste du marché de la gestion de contenu d’entreprise (les fournisseurs autres que Microsoft). Le rythme d’innovation est ahurissant. Les fournisseurs de logiciels compagnon de tenue des enregistrements n’ont pas à se soucier de l’architecture d’une plateforme de gestion de contenu d’entreprise. Ils peuvent donc consacrer toute leur énergie à la gestion des enregistrements, laissant la difficile gestion de contenu d’entreprise à Microsoft. Ils n’ont pas non plus à se préoccuper d’un dépôt, de la création de métadonnées ou même de la fonction de recherche; Microsoft fait déjà tout cela à leur place. Ils peuvent donc en toute liberté trouver de nouvelles façons innovantes d’appliquer la tenue des enregistrements aux documents. Les concurrents de la plateforme de gestion de contenu d’entreprise de Microsoft de leur côté, allouent relativement peu de ressources aux fonctionnalités de tenue des enregistrements. À l’inverse, les fournisseurs de logiciels compagnon de tenue des enregistrements ont mis sur pied des équipes de développement, de marketing et de soutien qui se consacrent exclusivement à la tenue des enregistrements.

Il existe également une autre différence fondamentale. La plupart des produits de gestion de contenu d’entreprise traditionnels sont ce que j’appelle fondés sur l’emplacement, en ce sens que les actions et les caractéristiques des documents sont déterminées par leur emplacement (dossier, bibliothèque, etc.). C’est donc l’emplacement qui compte, et les utilisateurs doivent constamment se soucier de l’endroit où les fichiers sont stockés. SharePoint chamboule complètement cette logique, avec une approche indépendante de l’emplacement, selon laquelle l’emplacement n’a pas d’importance. Au cours des années, l’une des objections les plus pertinentes dont m’ont fait part les utilisateurs finaux était leur aversion à se faire dicter l’emplacement où ils doivent stocker leurs documents. Jusqu’à maintenant, la plupart des solutions de gestion de contenu d’entreprise (mais pas toutes) dictaient aux utilisateurs l’emplacement où stocker les documents pour que la tenue des enregistrements puisse se faire correctement.

Et voici où la magie a vraiment fait son œuvre : les quatre fournisseurs ont finalement offert des fonctionnalités qui permettaient une véritable automatisation de la tenue des enregistrements. La tenue des enregistrements basée sur des règles est une capacité logicielle qui permet aux professionnels de la gestion des documents et de l’information d’automatiser entièrement les éléments suivants (voir l’esquisse qui suit la liste) :

  • Déterminer quels documents sont des enregistrements;
  • Décider du moment où déclarer les documents comme enregistrements;
  • Classer les documents selon le calendrier de conservation (correctement cette fois!);
  • Décider du moment où classer les enregistrements dans une archive à long terme.
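À titre d’illustration seulement, voici une esquisse minimale (en Python) du principe de la tenue des enregistrements basée sur des règles. Les noms de métadonnées, les conditions et les codes de conservation sont hypothétiques, inventés pour l’exemple; l’esquisse ne reproduit le fonctionnement d’aucun logiciel compagnon en particulier.

    from dataclasses import dataclass

    @dataclass
    class Regle:
        # Règle hypothétique : si les métadonnées du document satisfont les
        # conditions, il est déclaré enregistrement et classé sous le code
        # de conservation indiqué, sans intervention de l'utilisateur final.
        conditions: dict
        code_conservation: str

    REGLES = [
        Regle({"type_contenu": "Contrat"}, "JUR-100"),
        Regle({"service": "Finances", "type_contenu": "Facture"}, "FIN-210"),
    ]

    def classer_automatiquement(metadonnees):
        """Retourne le code de conservation de la première règle satisfaite, sinon None."""
        for regle in REGLES:
            if all(metadonnees.get(c) == v for c, v in regle.conditions.items()):
                return regle.code_conservation
        return None  # aucune règle satisfaite : le document demeure un simple document

    print(classer_automatiquement({"service": "Finances", "type_contenu": "Facture"}))  # FIN-210

Le point essentiel est que la décision repose sur des règles établies d’avance par le professionnel de la gestion des documents et de l’information, et non sur le jugement de chaque utilisateur final.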

Voilà l’avancée que j’attendais depuis 30 ans! Depuis trois décennies, nous dépendions des utilisateurs pour identifier quels documents sont des enregistrements et pour les classer selon un calendrier de conservation qui ne les intéresse pas et qu’ils ne comprennent pas. C’est cette dépendance envers les utilisateurs qui a freiné cette technologie depuis le tout début. À mon avis, tout cela est maintenant de l’histoire ancienne. Pour être équitable envers les fournisseurs autres que ceux liés à SharePoint, il faut dire que certains produits de gestion de contenu d’entreprise traditionnels ont intégré un certain niveau d’automatisation, mais je n’ai jamais rien vu qui s’approchait du niveau d’automatisation du segment de marché de SharePoint.

Vous pourriez en conclure que je fais davantage la promotion des solutions de SharePoint que des autres produits de gestion de contenu d’entreprise, mais vous auriez tort, une fois de plus. Les fournisseurs de plateformes de gestion de contenu d’entreprise traditionnelles disposeront toujours d’une position forte sur le marché. Microsoft est la première à admettre que son produit ne peut satisfaire les besoins de tout le monde. J’ai des clients qui ont des exigences de gestion de contenu d’entreprise que SharePoint ne peut pas respecter, et je doute que SharePoint soit un jour en mesure de le faire. L’un de mes clients utilise Documentum pour gérer des millions de dossiers de maintenance et de dessins en aéronautique – je doute fort qu’il passe à SharePoint de mon vivant! À l’heure actuelle, on observe que l’utilisation de SharePoint augmente sans cesse au sein des organisations où sont installées des plateformes de gestion de contenu d’entreprise traditionnelles autres que celle de Microsoft. Certains passeront inévitablement à SharePoint. D’autres l’ignoreront et poursuivront leurs activités sans problème. Certains essaieront de faire fonctionner ensemble les deux systèmes – SharePoint pour la création de documents, et la plateforme de gestion de contenu d’entreprise traditionnelle comme dépôt formel des enregistrements. C’est techniquement possible, mais c’est assez difficile à mettre en place et à gérer.

Tous les fournisseurs de plateformes de gestion de contenu d’entreprise autres que Microsoft ont les moyens d’offrir de bonnes fonctionnalités de tenue des enregistrements dans leur gamme de solutions. Seulement, plus de temps et de travail sont nécessaires pour atteindre cet objectif. Absolument rien n’empêche ces fournisseurs de plateforme de gestion de contenu d’entreprise de mettre en place les mêmes capacités d’automatisation de tenue des enregistrements basée sur des règles dans leurs produits, et je les y encourage fortement.

 

Où en sommes-nous?

Les systèmes de gestion de contenu d’entreprise qui offrent des fonctionnalités de tenue des enregistrements électroniques sont de nos jours mieux connus sur le marché sous le nom de SGEDD (Système de Gestion Électronique des Documents et des Dossiers). C’est donc le terme que j’utiliserai à compter de maintenant.

À l’heure actuelle, la technologie SGEDD peut être classée en trois « groupes » distincts sur le marché.

Plateformes traditionnelles de gestion de contenu d’entreprise – Il s’agit des plateformes des fournisseurs importants de gestion de contenu d’entreprise auxquels j’ai fait référence (IBM, OpenText, etc.). Afin d’obtenir des fonctionnalités de SGEDD, il faut d’abord investir dans une plateforme de gestion de contenu d’entreprise de pointe et ensuite utiliser la composante « tenue des enregistrements » de la plateforme. En date de ce rapport, ces fournisseurs n’ont pas autant innové dans la tenue des enregistrements basée sur des règles que les fournisseurs de logiciels compagnon pour SharePoint, mais la situation pourrait changer rapidement.

SharePoint – L’achat d’un logiciel compagnon de tiers est nécessaire pour compléter SharePoint. C’est en soi un inconvénient : tous les professionnels des TI vous diront que l’intégration de produits séparés n’est jamais la meilleure option technologique. Toutefois, cette situation permet à la technologie des logiciels compagnon de tenue des enregistrements d’être plus ciblée et innovante. En prime, la tenue des enregistrements basée sur des règles permet d’automatiser la majeure partie des fonctions de tenue des enregistrements, de manière à ne plus dépendre des utilisateurs finaux pour atteindre les objectifs de rendement en matière de tenue des enregistrements.

Systèmes indépendants – Les entreprises de logiciels spécialisées en tenue des enregistrements qui ciblent principalement les professionnels de la gestion des documents et de l’information et qui ont élargi leurs produits pour inclure des fonctions de gestion de contenu d’entreprise sont peu nombreuses (je ne peux en nommer que trois, peut-être quatre). Leurs produits sont centrés sur les enregistrements, ce qui signifie que les enregistrements sont au cœur de ces produits, destinés au marché de la tenue des enregistrements. J’inclus dans ce groupe HP, avec son offre HP RM. Bien que la configuration technologique de HP RM soit semblable à celle des deux autres produits de ce groupe, l’entreprise à l’origine du produit est assez différente : HP est une entreprise d’envergure mondiale, alors que les deux autres sont de petits acteurs occupant un créneau spécialisé.

 

Ces trois groupes ont tous leur place sur le marché, et leurs fournisseurs continueront d’offrir des solutions qui fonctionnent. Je n’ai pratiquement jamais vu une organisation choisir une technologie de gestion de contenu d’entreprise uniquement en fonction des exigences de tenue des enregistrements. Le plus souvent, le service des TI choisira soit SharePoint, soit une technologie de gestion de contenu d’entreprise traditionnelle, et s’en tiendra à elle. Les fonctionnalités de tenue des enregistrements dépendent alors de ce choix technologique. Par exemple, l’entreprise qui fait le choix d’OpenText devra obligatoirement utiliser les fonctions de tenue des enregistrements d’OpenText. Faire le choix de SharePoint, en revanche, c’est pouvoir choisir entre quatre logiciels compagnon de tenue des enregistrements.

Je n’aime pas trop les solutions des systèmes indépendants que j’ai pu observer jusqu’à présent. J’ai vu des organisations éprouver de grandes difficultés à obtenir l’adhésion des utilisateurs finaux, surtout dans les déploiements de petite et moyenne envergure (moins de 1 000 utilisateurs). Ces produits conviendront probablement mieux aux entreprises dont la culture est fortement orientée vers la tenue des enregistrements. Je crois qu’il faudra, pour assurer l’adoption par les utilisateurs finaux, plus de travail qu’avec les solutions des autres groupes.

Vous souvenez-vous des quatre problèmes que j’ai mentionnés précédemment et qui ont plombé les projets de SGEDD jusqu’à maintenant? Eh bien, ces problèmes existent toujours. Mais alors, qu’est-ce qui a changé? C’est la technologie, particulièrement la technologie liée à SharePoint. Grâce à la tenue des enregistrements basée sur des règles, les chances de réussite sont bien meilleures. La faible priorité accordée à la gestion des enregistrements par rapport à la plateforme de gestion de contenu d’entreprise n’est désormais plus un obstacle aussi difficile à surmonter qu’auparavant. Former et outiller les professionnels de la gestion des documents et de l’information demeure toutefois un défi de taille. Quant aux deux autres facteurs, ils restent entiers : le déploiement réussi d’une plateforme de gestion de contenu d’entreprise est déjà assez difficile en lui-même, et si les professionnels de la gestion des documents et de l’information et les TI ne collaborent pas, la tenue des enregistrements est encore et toujours vouée à l’échec.

 

Courriels

D’après mes estimations, les courriels représentent de 30 % à 80 % de tous les enregistrements numériques d’une organisation, et ils doivent tous être contrôlés et gérés adéquatement comme des enregistrements. Autrement dit, il y a de trois à cinq fois plus de courriels que de documents qui répondent aux critères d’un enregistrement d’affaires, et ces courriels doivent être gérés comme des enregistrements. Aucune entreprise ne peut prétendre gérer ses enregistrements électroniques si les courriels ne font pas l’objet d’un processus de contrôle des enregistrements.

Sur le plan technologique, le courriel demeure le talon d’Achille de tout projet moderne de SGEDD. Le problème réside dans le fait que la plupart des organisations utilisent la plateforme Outlook/Exchange pour leurs courriels. Cette plateforme de messagerie n’est dotée d’absolument aucune fonctionnalité de tenue des enregistrements. Elle constitue un îlot gigantesque de données stockées, en tous points déconnecté de la plateforme de gestion de contenu d’entreprise. Même la plateforme de gestion de contenu d’entreprise de Microsoft n’intègre pas les courriels. Dans le marché des plateformes de gestion de contenu d’entreprise, chaque fournisseur doit écrire des intégrations spécifiques entre ses produits et Exchange afin que les utilisateurs puissent facilement verser leurs courriels dans la plateforme pour qu’ils puissent être gérés comme des enregistrements. Dans le cas de SharePoint, encore un autre produit de tiers est nécessaire pour intégrer les courriels au logiciel. En effet, pour toute solution de SGEDD dans SharePoint, il faut faire l’acquisition de trois technologies différentes : SharePoint, le logiciel compagnon de tenue des enregistrements et un produit d’intégration des courriels.

Même si les courriels sont étroitement intégrés à la plateforme de gestion de contenu d’entreprise de manière à ce que les utilisateurs puissent facilement les y verser, le choix des courriels à intégrer demeure la décision de l’utilisateur final. Celui-ci doit décider quels courriels sont importants pour l’organisation, puis suivre le processus de soumission des courriels dans la plateforme, ce qui suppose de remplir les métadonnées obligatoires indiquant sur quoi porte le courriel. Tout cela vient s’ajouter à l’effort demandé à l’utilisateur final, ce qui nous ramène en arrière, dans les sombres années où nous dépendions du jugement et des efforts des utilisateurs. Nous en connaissons tous le résultat!

Il existe quelques nouvelles solutions technologiques innovantes (je ne peux en nommer qu’une avec certitude, peut-être deux) qui ont recours à l’intelligence artificielle ou à une technologie semblable pour lire les courriels de la boîte de réception et déterminer lesquels sont des enregistrements. Ce type de logiciel classe tous les courriels selon la probabilité qu’ils soient des enregistrements et détermine la manière dont ils doivent être classés. Est-ce que cette technologie fonctionne? Plus ou moins. Dans certains cas et avec certains types de courriels (courriels prévisibles, bien décrits), elle fonctionne très bien. Dans d’autres cas, le résultat est vraiment médiocre. Dans l’ensemble, cette technologie est prometteuse, mais nous sommes encore loin d’une utilisation courante et facile. Je remarque également que les coûts et les frais généraux nécessaires pour soutenir ces fonctionnalités ambitieuses sont exorbitants. Je suis donc d’avis qu’elles ne conviennent qu’à des projets de grande envergure, bien financés et dotés de ressources considérables.
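Pour donner une idée du principe, voici une esquisse volontairement naïve (en Python, avec la bibliothèque scikit-learn) d’un classement probabiliste de courriels. Les données et les étiquettes sont fictives, et l’esquisse n’illustre que l’idée générale; les produits dont je parle s’appuient sur des corpus d’entraînement beaucoup plus vastes et des modèles bien plus fins.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB

    courriels = [
        "Veuillez trouver le contrat signé en pièce jointe",
        "On dîne où ce midi?",
        "Facture no 4521 pour approbation",
        "Joyeux anniversaire à toute l'équipe!",
    ]
    est_enregistrement = [1, 0, 1, 0]  # étiquettes fournies par un humain

    # Le modèle apprend à distinguer les courriels « enregistrements » des autres.
    vectoriseur = TfidfVectorizer()
    modele = MultinomialNB().fit(vectoriseur.fit_transform(courriels), est_enregistrement)

    # Chaque nouveau courriel reçoit une probabilité d'être un enregistrement,
    # ce qui permet de le classer ou de l'acheminer vers une révision humaine.
    nouveau = vectoriseur.transform(["Contrat de service à faire approuver"])
    print(modele.predict_proba(nouveau)[0][1])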

 

L’avenir de la tenue des enregistrements électroniques

Lorsque j’envisage l’avenir de la tenue des enregistrements électroniques, je pourrais facilement tenter de prédire un grand nombre de tendances technologiques. C’est un exercice amusant après tout. Je pourrais prédire que nous serons un jour pratiquement tous dans le nuage. Je peux certainement prédire que les fournisseurs de plateformes de gestion de contenu d’entreprise rattraperont les fournisseurs de logiciels compagnon pour SharePoint et offriront des fonctionnalités de tenue des enregistrements basée sur des règles. Mais ces prédictions ne nous aideront pas.

D’ailleurs, ce n’est pas ce que je vois dans ma boule de cristal. Ce que j’envisage pour l’avenir, c’est la chose qui m’obsède le plus ces temps-ci : l’éducation.

C’est sur l’éducation qu’il faut mettre l’accent. Beaucoup plus. Les meilleures technologies du monde ne nous serviront à rien si nous ne savons pas comment les utiliser pour atteindre nos objectifs. Nous disposons d’un grand nombre de technologies, et la technologie en est maintenant au point où nous pouvons dans une large mesure automatiser les processus de tenue des enregistrements. Dans le passé, c’est la technologie qui nous freinait; ce n’est plus le cas aujourd’hui. Désormais, ce qui nous freine, c’est nous-mêmes. Nous devons mieux aider les professionnels de la gestion des documents et de l’information à acquérir les connaissances et les compétences nécessaires pour comprendre la gestion de contenu d’entreprise, au point de pouvoir influencer la manière dont la plateforme est configurée en vue de son déploiement. Nous devons mieux comprendre les logiciels modernes de SGEDD afin d’apprendre à automatiser la tenue des enregistrements au moyen de règles. Et finalement, nous devons mieux comprendre les méthodes et les techniques de gestion des projets de SGEDD, comme l’établissement d’indicateurs de performance clés, afin d’assurer le bon déroulement des projets.

J’ai récemment laissé entendre à une grosse firme mondiale de gestion des enregistrements que ses projets de SGEDD n’étaient pas un grand succès et qu’un programme de formation de grande envergure était nécessaire pour renverser la situation. Et quelle a été sa réaction? « Nos clients se débrouillent très bien. Vous ne savez pas de quoi vous parlez. »

Nous possédons maintenant la technologie pour nous permettre de réussir. Pensez à un avion de ligne moderne en attente sur la piste. Sans pilote détenant la formation et les aptitudes nécessaires pour le faire voler, cette merveille de technologie n’est rien d’autre qu’un appareil coûteux et inutile. De nos jours, dans le domaine des SGEDD, beaucoup trop d’avions restent cloués au sol. Alors, qu’attendons-nous pour prendre notre envol?

 

1N.d.T. : Dans le présent document, le mot enregistrement est utilisé pour traduire le mot « record », l’expression gestion des enregistrements traduit « records management » et l’expression tenue des enregistrements traduit « recordkeeping ».
2Au moment de la rédaction du présent document, le SGDDI est désormais connu sous le nom de GCDocs.