Categories: Sagesse

Sagesse White Papers

Are You Stuck in the Past with your RIM Program and Software?
Jasmine Boucher, BMgt, CIP

This paper discusses the historic and current challenges records managers face in implementing software, process, and technology solutions. Learn more about identifying the problems, promoting RIM as a vital resource worth investing organizational resources in, and taking the next steps in implementing a records management program.

Integrate Digital Preservation into your Information Governance Program
Lori J. Ashley

An exponentially increasing number of valued information assets are born and will spend their entire “lives” in the digital realm. For those assets that must be retained long-term, there are risks and serious threats from technology obsolescence that should be systematically considered and proactively addressed.

Migrating a Billion-Dollar Government Agency to a New Records System
Jas Shukla

ECM projects fail for many reasons. In our experience, the most common causes are unsupported legacy technology, lack of leadership, poor adoption driven by a poor user experience, or a flawed rollout process such as a ‘big bang’ approach.

This article takes an in-depth look at a successful ECM project with a provincial government agency and details how that project was rolled out over a period of 18 months.

Bring Order to Chaos: Regaining Control of Unstructured Data
Jacques Sauve

Organizations have been accumulating data for years, sometimes decades. Corporate networks store millions of files, folders, and emails. As the demand for more storage grew over the years, there were two options to consider:

  1. Clean up what was no longer needed and reclaim the space; or
  2. Add more storage capacity to the network systems.

Choosing Software – Getting to the Request for Proposal (RFP)
Brenda Prowse, CRM
Director of Professional Service & IM, Prima Information Solutions

Quick fix? Magic bullet? These are terms often used when organizations talk about purchasing software. Many seem to think a piece of software will solve all of their information management problems. To be honest, it can solve most of them, but not without some hard work first. So no, you can’t just say “Abracadabra” and make all the problems go away like magic!

Sagesse: 2020 Edition Volume V

Sagesse 2020, Volume V, Issue I

1- Introduction  [web link] [pdf]

2- Achieving Perfect PITCH  [web link] [pdf]

3- Our Phygital World and Information Management [web link] [pdf]

4- Preserving the Legacy of Residential Schooling Through a Rawlsian Framework [web link] [pdf]

5- Préserver l’héritage des pensionnats indiens en s’appuyant sur les principes de Rawls [web link] [pdf]

6- The Transformational Impacts of Big Data and Analytics [web link] [pdf]

7- Les effets transformateurs des mégadonnées et de l’analytique [web link] [pdf]

8- Tribute to Leonora Casey [web link] [pdf]

9- Smart wearables and Canadian Privacy: Consumer concerns and participation in the ecosystem of the Internet of Things (IoT) [web link] [pdf]

10- In memory of Ivan Saunders… [web link] [pdf]

 

White Papers 2020

Whitepaper – Integrate Digital Preservation into Your Information Governance Program [web link] [pdf]

Whitepaper – Case Study: Migrating a Billion-Dollar Government Agency to a New Records System [web link] [pdf]

Introduction

 

Estimated reading time: 6 minutes. Contains 1,200 words.

 

Welcome to ARMA Canada’s fifth issue of Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada publication.

Sagesse’s Editorial Team

Sagesse’s Editorial Board welcomes Barbara Bellamy, CRM, as ARMA Canada’s Director of Canadian Content, effective July 1, 2020. Barbara has a wealth of ARMA experience at the Chapter level through her volunteer work with the ARMA Calgary Chapter, as well as her contributions to ARMA International’s Education Foundation (AIEF). She co-authored an article that appeared in Sagesse’s 2019 edition (see “Email Policy and Adoption”) and is a regular speaker at ARMA Canada conferences. Barbara has already taken a leading role in moving Sagesse forward, as you will see in reading this edition. She is a most valuable addition to the team. Welcome, Barbara!

We also welcome Pat Burns, CRM, to the Sagesse team. Pat also has an extensive resume with ARMA Canada, ARMA International and the ARMA New Brunswick Chapter. She was Chapter President for many years, was ARMA Canada’s Region Director for four years, and then spent valuable time with ARMA International and the Institute of Certified Records Managers. We are thrilled to have Pat and her expertise on our team!

Congratulations are extended to Sagesse’s editorial board member Stuart Rennie, who received ARMA International’s Company of Fellows distinction (Fellow of ARMA International, or FAI) at ARMA International’s 2019 conference in Nashville. Stuart is the sixth FAI in Canada.


Sagesse: 2019 Edition Volume IV

Winter 2019 Sagesse Publication

  1. 2019 Introduction to Sagesse.
  2. Ethical Concerns in the Provision of eHealth Services. By Meagan Collins, Nicole Doro, Ben Robinson, & Ariel Stables-Kennedy. Recipients of $1,000 in the 2018 ARMA Canada Essay Contest.
  3. IT & Modern Public Service: How to Avoid IT Project Failure and Promote Success. By Scarlett Kelly. Recipient of $600 in the 2018 ARMA Canada Essay Contest.
  4. Defending Traditional Rights in the Digital Age: Ktunaxa Nation Case Study. by Michelle Barroca, BA, MAS.
  5. Politique relative aux messages électroniques et son adoption. Par Barb Bellamy, CRM et Anne Rathbone, CRM.
  6. Email Policy and Adoption. by Barb Bellamy, CRM & Anne Rathbone, CRM.
  7. Thank you to John Bolton! by Alexandra Bradley, CRM, FAI
  8. Are We There Yet? Introduction to the SharePoint & Collabware Implementation at the First Nations Summit. by Sandra Dunkin, CRM, IGP & Sam Hoff
  9. Quand est-ce qu’on arrive? Introduction à l’implémentation de SharePoint & Collabware au First Nations Summit Society. Par Sandra Dunkin, CRM, IGP & Sam Hoff
  10. A Record of Service: The First Canadian Conference on Records Management “A Giant Success.” by Sue Rock, CRM

Biographies

Michelle Barroca (BA, MAS) is an independent recorded information management consultant from the Kootenay region of BC. Michelle graduated from UBC’s School of Library, Archival, and Information Studies with a Master of Archival Studies degree in 2000.  Shortly after university, she was hired as the City of Burnaby’s first records manager and archivist, and later worked at the City of Kelowna as the Records and Information Coordinator. For nearly a decade, Michelle has operated FY Information Management Consulting and provides services to various local governments and First Nations organizations, including the Ktunaxa Nation Council.

Barb Bellamy, CRM, has over 30 years of experience in Records and Information Management for the Utility and Energy industries. She currently specializes in Information Management Consulting and Acquisition and Divestment work within Suncor. Barbara is a Certified Records Manager and holds a bachelor’s degree in Business Administration.

Alexandra (Sandie) Bradley, CRM, FAI has been a records and information manager for over 40 years, and a member of ARMA for 35 years. Through her chapter, regional and international roles within ARMA, she is a mentor and teacher, researcher, writer and advocate for our profession. She is a Certified Records Manager and was made a Fellow of ARMA International (Number 47) in 2012. She is a member of the Vancouver Chapter, is currently a member of the Sagesse Editorial Board and also the Content Editorial Board for ARMA International.

Meagan Collins holds an MA and Honours BA in Political Science from Brock University and is presently an MLIS student at Western University. She is currently completing a co-op posting as an information management officer with the Federal Government. Her research interests include information policy, knowledge management, Big Data, privacy and surveillance.

Nicole Doro holds an MA and Honours BA in English and Critical Theory from McMaster University, and is presently an MLIS student at Western University. She is currently completing an Instructional and Research co-op at McMaster University. Her research interests include open access, institutional repositories, and privacy and surveillance.

Sandra Dunkin (MLIS, CRM, IGP) is the Records & Information Management Coordinator at the First Nations Summit (FNS), where she also has responsibility for IT. She is the project manager for the SharePoint implementation at the FNS and is currently the outgoing Chair of the Information Governance Advisory Committee for the First Nations Public Service Secretariat.

Sam Hoff is a Senior Consultant at Gravity Union. He is passionate about providing user-experience-focused solutions to meet document and records management needs. With a background in both the technical and human sides of technology and business, he has focused on SharePoint-based Enterprise Content Management and Records Management solutions for the past seven years.

Scarlett Kelly holds both a Master of Public Administration (MPA) and a Master of Library and Information Studies (MLIS). Her research focuses on IT project management, strategic planning, information/data policy, and change management. Currently a Research Specialist at the Canada Revenue Agency, Scarlett is gaining valuable experience in legal and business research.

Anne Rathbone has 22 years’ experience in the public sector establishing, implementing and maintaining records and information management programs. She earned her Certified Records Manager (CRM) designation in 2010. Currently, Anne is the Records Management Coordinator for the Sunshine Coast Regional District.

Ben Robinson is a poet, musician and librarian. In October 2017, Bird, Buried Press published his first poetry chapbook “Mayami.” In 2018 he was named the Emerging Artist in the Writing category at the Hamilton Arts Awards. He is currently preparing an article for publication on policing and security in the public library.

Sue Rock, CRM, is a records management consultant with a diverse workload including records management assessments, records audits, classification systems and retention schedules. She ghost-writes policies, procedures, memos, letters, debates, rebuttals, opinions, guidelines, action checklists and presentations. Sue’s educational background comprises a Bachelor of Arts, Université d’Ottawa, and a Professional Specialization Certificate in International Intellectual Property Law co-sponsored by the University of Victoria, Canada and St. Peter’s College, Oxford, England. She is a Past President of the ARMA Calgary Chapter and a Chapter Member of the Year award winner. Her publications include “A Record of Service: Bob Morin, CRM” (ARMA Canada Sagesse), “RIM’s Role in the Technology Lifecycle” (ARMA International), and “Managing Privacy Conflicts Across Borders – Vendor Awareness and Action” (Security Shredding & Storage News), and she was team lead for the ARMA International guideline “Records Management for Information Technology Professionals.” Sue is owner of The Rockfiles Inc. and an operating partner of Trepanier Rock™, with a specific focus on ensuring records management principles are embedded into information management technology solutions.

Ariel Stables-Kennedy holds a BSc in Kinesiology from the University of Waterloo and is currently completing her MLIS at Western University in London, Ontario. Her research interests include instructional design, the information literacy learning needs of STEM students, and information ethics.

Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada Publication, Winter 2019, Volume IV, Issue I

 

Introduction

Welcome to ARMA Canada’s fourth issue of Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada publication.

Sagesse Editorial Team

The Sagesse Editorial Board would like to take this opportunity to gratefully acknowledge and thank John Bolton, who was a founding member of Sagesse: Journal of Canadian Records and Information Management. John resigned in mid-2018 after making many invaluable contributions to this publication. His impact on Sagesse and the RIM and IG community is significant. Sandie Bradley has captured some of his brilliance in an article titled “Thank you to John Bolton!” in this issue.

The Editorial Board would like to welcome Sandra Dunkin, MLIS, CRM, IGP to the editorial team. Sandra brings a wealth of ARMA and workplace experience, from her work with the ARMA Vancouver Chapter and as ARMA Canada’s Program Director to her work at the First Nations Summit. Sandra has already demonstrated her commitment and dedication to the Sagesse team. She also has an article in this edition – see “Are We There Yet?”

2018 University Contest

There’s a new addition to Sagesse’s 2019 edition: we are excited to share that in 2018, ARMA Canada and the Sagesse Editorial Board held an essay contest for students enrolled in graduate information management programs at a number of Canadian universities. The article that best met the theme, Canadian Issues for Information Management, received $1,000, while the runner-up received $600.

We are honoured to announce that the team of Meagan Collins, Nicole Doro, Ben Robinson and Ariel Stables-Kennedy from the University of Western Ontario were the $1,000 recipients with their article “Privacy and Ethical Concerns in the Provision of eHealth Services.” Their article presents an ethical analysis of the data collection policies of Canadian live eHealth service providers and comments on the extent to which they conform to the Canadian Medical Association’s Code of Ethics.

Scarlett Kelly, from Dalhousie University, was the $600 recipient with her article “Information Technology and Modern Public Service: How to Avoid IT Project Failure and Promote Success.” Scarlett examines a global scan indicating that IT projects may have a failure rate as high as 85%, discussing the problem from the perspectives of the people involved, IT processes, and the product delivered to users.

Congratulations to our students!

To our readers, prepare to be captivated with the content, writing and research in these articles as well as the others in this edition.

2019 Sagesse Articles

Continuing with Sagesse’s tradition of providing thought provoking content, this 2019 issue features the following articles:

  • Michelle Barroca, BA, MAS, provides a case study in her article “Defending Traditional Rights in the Digital Age: Ktunaxa Nation Case Study” of the Ktunaxa Nation’s experience using business and cultural records to legally defend traditional rights in opposing permanent development in a culturally significant and ecologically sensitive area of southeastern British Columbia.
  • Barb Bellamy, CRM and Anne Rathbone, CRM, discuss the challenges an organization faces in establishing an email policy, and the compliance requirements employees need to embrace, in “Email Policy and Adoption.” They provide an excellent array of tools to assist this process, including catch phrases, branding and gamification. This paper has been translated into French.
  • Alexandra (Sandie) Bradley, CRM, FAI, contributes an article that pays tribute to and provides the biography of John Bolton, CRM. John was a founding member of Sagesse: Journal of Canadian Records and Information Management, and his wisdom was a guiding light while he was a member of this team. In “Thank you to John Bolton,” she describes his RIM journey and the many accomplishments of his career.
  • “Are We There Yet?” an article by Sandra Dunkin, MLIS, CRM, IGP and Sam Hoff, ECM Consultant, explores a SharePoint implementation project conducted at the First Nations Summit while underscoring the lessons learned, missteps, failures and eventual successes. Also addressed are the challenges of change management, user training and adoption, and various upgrades as users engage with SharePoint. This paper has also been translated into French.
  • Who doesn’t enjoy an ARMA Canada conference? Sue Rock’s (CRM) article “A Record of Service: The First Canadian Conference on Records Management ‘A Giant Success’” looks back at the very first ARMA Canada conference, held in beautiful Banff, Alberta in February 1980. Discover how some of ARMA Canada’s traditions were developed and have evolved over the years.

Please note the disclaimer at the end of this Introduction stating that the opinions expressed by the authors in this publication are not the opinions of ARMA Canada or the editorial committee. We are interested in hearing whether or not you agree with this content, or whether you have other thoughts or recommendations about the publication. Please share and forward them to: sagesse@armacanada.org

If you are interested in providing an article for Sagesse, or wish to obtain more information on writing for Sagesse, visit ARMA Canada’s website – www.armacanada.org – see Sagesse.

Enjoy!

 

ARMA Canada’s Sagesse Editorial Review Committee:

Christine Ardern, CRM, FAI, IGP; Alexandra (Sandie) Bradley, CRM, FAI; Sandra Dunkin, MLIS, CRM, IGP; Stuart Rennie, JD, MLIS, BA (Hons.); Uta Fox, CRM, FAI, Director of Canadian Content.

 

Disclaimer

The contents of material published on the ARMA Canada website are for general information purposes only and are not intended to provide legal advice or opinion of any kind. The contents of this publication should not be relied upon. The contents of this publication should not be seen as a substitute for obtaining competent legal counsel or advice or other professional advice. If legal advice or counsel or other professional advice is required, the services of a competent professional person should be sought.

While ARMA Canada has made reasonable efforts to ensure that the contents of this publication are accurate, ARMA Canada does not warrant or guarantee the accuracy, currency or completeness of the contents of this publication. Opinions of authors of material published on the ARMA Canada website are not an endorsement by ARMA Canada or ARMA International and do not necessarily reflect the opinion or policy of ARMA Canada or ARMA International.

ARMA Canada expressly disclaims all representations, warranties, conditions and endorsements. In no event shall ARMA Canada, its directors, agents, consultants or employees be liable for any loss, damages or costs whatsoever, including (without limiting the generality of the foregoing) any direct, indirect, punitive, special, exemplary or consequential damages arising from, or in connection to, any use of any of the contents of this publication.

Material published on the ARMA Canada website may contain links to other websites. These links to other websites are not under the control of ARMA Canada and are merely provided solely for the convenience of users. ARMA Canada assumes no responsibility or guarantee for the accuracy or legality of material published on these other websites. ARMA Canada does not endorse these other websites or the material published there.

Privacy and Ethical Concerns in the Provision of eHealth Services

by Meagan Collins, Nicole Doro, Ben Robinson, and Ariel Stables-Kennedy

 

Introduction

The development of the internet and the advancement of digital technologies have resulted in life-saving access to information, especially within the medical industry. Whereas before the introduction of these technologies individuals would need to visit a doctor’s office with their health-related questions, they can now consult resources themselves on the internet. A study showed that 83% of internet users around the world have searched for health information1. In 2012, the number of Canadians who regularly used the internet to search for medical and health-related information at home was over 65%, making it the sixth most common use of the internet in Canada2. Along with increased access to medical information, Information Communication Technology (ICT) is also an integral tool in the field of health services known throughout Canada as eHealth.

eHealth is defined as blending the internet, telecommunications, and information technology with medical services provision3. According to the Government of Canada, the incorporation of eHealth into the mainstream medical treatment framework has been a significant priority for Canada over the past 20 years4. Since 2010, Canada has budgeted $20 billion towards the creation of a national health infostructure allowing for advances in several aspects of healthcare service models5. Integrated technologies now allow patients to receive medical care through telemedicine applications, a major benefit for those who live in rural areas or who are homebound. It also provides a more convenient service model by allowing patients affected by rare medical disorders to receive service from distant specialists6. The recognized benefits of this system have resulted in 53% of Canadian primary-care physicians using some form of electronic medical reporting technologies, up from 14% in 20007. While the improvements to medical services were widely accepted as significantly beneficial, eHealth adoption also allowed for many potential advances in Health Informatics (HI).

HI “involves the application of information technology to facilitate the creation and use of health-related data, information and knowledge8.” In addition, HI serves two functions in the eHealth model. First, it is designed to improve the experience of clinical practitioners through information and knowledge management. Second, it has also helped to improve the accessibility of health information for caregivers, patients and the public9. This emphasis on HI and the resulting data collection within these medical services has led to an increase in concerns surrounding the ethics of data collection in this particularly sensitive field10.

This study aims to add to the international discussion by performing an ethical analysis of the data collection policies of Canadian live eHealth service providers and examining to what extent they conform to the Canadian Medical Association’s (CMA) Code of Ethics. In addressing this question, this paper will begin with a brief overview of related work in the field of data privacy pertaining to medical information to situate this study within the overall discourse on the topic.

Next, the reasoning for using the CMA’s Code of Ethics as an ethical framework and the general methodological approach will be further explained. After exploring the results of the examination of the Privacy Policies and Terms of Use agreements of several eHealth service providers, the discussion will focus on the efficacy of current policy provisions and potential future research areas within this topic.

 

Literature review 

Research analyzing the ethics of the eHealth industry began to emerge in the early 2000s. The early literature was primarily theoretical, mostly addressing the possible implications of eHealth services and technologies as an emerging trend within the field. The progression of work in this area has grown to include more practical analyses, in addition to the theoretical studies on eHealth, as concerns about privacy, data collection, equality and quality of medical care have become more prominent. Prior studies that address data collection can be organized into four main categories. This paper will provide some background on the literature that focused on:

  • the importance of protecting privacy and confidentiality in eHealth services as an ethical imperative for the industry;
  • the need to have ethical standards and frameworks in place to regulate the collection and use of data in eHealth services;
  • how the industry defines “health data” and how this impacts ethical data collection and use; and
  • the integration of eHealth data and social media.

Ethical Imperative of Privacy and Consent

Early research looked at the integration of ICT in the health field and discussed the potential implications of these technologies for patient privacy and confidentiality. Dyer framed the need for privacy as a tenet of the ethical code of the American Medical Association and discussed the emergence of eHealth technology as a threat to the patient-physician bond, which shifted to a physician-customer model11. Following this concern, other scholars focused on the need for privacy to ensure patients trust and understand the health system their physicians operate within, so that they feel safe providing sensitive but accurate information12 13 14 15. Chaet et al. addressed a further complication of this issue in their discussion of the potential breach of patient privacy through the use of third-party platforms, stating, “websites that offer health information may not actually be as anonymous as visitors think; they may leak information to third parties through code on a website or implanted on patients’ computers16.”

Research on this topic has often been split between emphasizing the need for collected information to be de-identified in order to protect individual anonymity, and emphasizing the need for flexibility in data collection to allow for advanced analysis of sensitive information for the benefit of improving the service delivery of our universal healthcare system17. Other studies look at embedding privacy in the very design of these online platforms, both to ensure protection of patient privacy and to manage the custody of data and consent18.

Need for Ethical Provisions and Regulation

Studies that focus specifically on the ethical imperative of privacy within the medical services field are often related to other studies that speak to the need for regulations or ethical codes to be embedded within the eHealth framework. Wadhwa and Wright specifically address the point that the ethics of the medical industry have always been evolving, and the move towards an eHealth model is just another example of a period when society needs to update its ethical understanding of the practice for the digital age19. Soenens was among the earlier voices on this issue, stating the importance of ensuring that the tenets of the Hippocratic Oath remain ingrained in eHealth services20. Other literature in this area looks beyond system design to examine physicians themselves as important privacy actors. Derse and Miller argue that physicians should only use eHealth systems that are transparent about their privacy policies and meet acceptable ethical standards of patient confidentiality21.

Another branch of research looks at the role state regulators play in the eHealth system. Studies conducted on this topic have sought to examine how state privacy regulations in the European Union (EU) and the United States (US) have been characterized as barriers to the continued success and advances of eHealth systems22. They have also addressed the lack of regulation and the power of third-party organizations to control the management of personal health data23. Additional studies have explored the ownership of patient data in third-party platforms and the effect this has on whether medical data can be used for informed improvements to the medical care system for the benefit of the overall public good, especially in countries with publicly funded healthcare like Canada24.

Defining Health Data and Securitization

Along with analyzing medical data regulation, other studies addressed the secondary issue of how to understand and define personal health data from a regulatory perspective. Kleinpeter highlights how recent digital devices enable the constant collection of personalized health data at an unprecedented level25. This type of collection makes it difficult for legislators to determine which types of data to regulate and how to regulate them. It would be important to distinguish between data collected during an individual patient’s diagnosis and anonymized data collected for the purpose of medical research. There may even be some types of patient data that should be legally prohibited from collection altogether.

Some critics have noted that the lack of clarity regarding ownership of this data makes it especially difficult to regulate ethically. For example, Lee noted that “Unlike victims of breaches of financial data, to whom reparations can be made, victims of breaches of private health data cannot be ‘made whole’; information cannot be ‘taken back.’”26

eHealth, Social Media, and Medical Advertising

The growth of social media platforms has become a more common theme within eHealth data discourse. As more social media components are added to eHealth services and patients are encouraged to engage, individuals are sharing their health stories and providing their medical data to commercial platforms more regularly. Even when these platforms are managed by public healthcare systems, this increased sharing of health information makes the privacy issues much more complicated. While taking a largely positive view of social media sites and eHealth services, Winkelstein also echoes some of this concern27. Other studies have specifically analyzed third-party platforms and discussed how the ethical gaps in their processes should be addressed28. The literature has also addressed situations where health data has been released with informed consent but is later provided to a third-party organization which combines that data with non-health-related data in order to be able to do targeted advertising based on a patient’s particular health conditions29.

This extensive body of work, along with the structured literature review performed by Khoja, Durrani and Nayani, shows that ethical and legal issues within the field of eHealth studies are, and will likely continue to be, a significant research area30. Up until now, this research has tended to focus on the United States, the United Kingdom and Europe and has been largely theoretical in nature, analyzing and critiquing the system as a whole rather than undertaking in-depth analyses of individual services. As such, this study provides a unique perspective on the ethical discourses in eHealth by focusing specifically on the data collection and terms of use practices of major eHealth services within the Canadian healthcare system.

 

Theoretical Framework

The CMA Code of Ethics will serve as the ethical framework for this investigation of the chosen eHealth services. Specifically, the “Privacy and Confidentiality” codes from the CMA’s Code of Ethics were used to measure each platform on a pass-fail basis. This Code of Ethics was chosen as the framework because of its geographic and political relevance, as it “constitutes a compilation of provisions that can provide a common ethical framework for Canadian physicians31.” As physicians operating within the Canadian healthcare system are required to follow this code, it will be used as a framework to evaluate the recent evolution of this service model. As the CMA Code of Ethics was last updated in 2004 and deemed “still relevant” after a review in March of 2018, this study will include recommendations for future adaptation and revision32.

 

Methods

Since the objective of this research is to evaluate eHealth services and their publicly available privacy policies, the researchers located the platforms through popular online search methods. All searches were performed through Google, as it is the most popular search engine among Canadians and, as such, likely where users will turn for their online health information33. The search queries used to locate the eHealth platforms were “Canada online health chat,” “Canada telehealth services,” “Canada live health chat,” “virtual doctor Canada,” and “online doctor Canada free”; relevant hits that appeared within the first one to three pages were selected.

Only services that facilitated conversations (including chats or teleconferences) between trained healthcare providers, whether that was a doctor, Registered Nurse, or Registered Nurse Practitioner, were considered. Mental health helplines that featured live chats were also considered, as these platforms had similar capabilities for live chat and thus have potential implications for the privacy of users’ personal health data. Ultimately, 18 platforms were chosen for consideration based on these search criteria: GOeVISIT, National Eating Disorder Information Centre (NEDIC), sexualhealthontario, Dialogue, Ontario Telemedicine Network (OTN), Livecare, Viva Care, Maple, Equinoxe LifeCare (EQ Care), YourDoctors.Online, Medicuro, Mental Health Helpline, youthspace.ca, Toronto Distress Centre Online Chat and Text Service (ONTX), Medeo, MDKonsult, Akira, and Ask The Doctor.

The chosen inclusion criteria allowed for both mobile applications and websites (which may license to third-party vendors) to be considered. Although the term “telehealth” technically encompasses both telephone calls and digital health services, telephone health services (i.e. voice calls) were excluded to narrow the scope of the project. Platforms that offered services such as video chats were included provided these services were not the sole eHealth service available on that platform. All chosen platforms had a live, online communication exchange component, whether via a pre-arranged appointment for a text conversation, a videoconference (e.g. Maple) or by waiting in a queue for the next available professional (e.g. youthspace.ca, Mental Health Helpline).

First, each service was examined to determine its funding model (private, public, or non-profit), if advertisements were used, if the patient was charged a fee for the service, or if the services were funded through the existing provincial healthcare system. Each service’s social media presence was also examined to determine whether they promoted their social media channels to users, required their users’ social media information or provided the option to sign-in to the service using existing social media accounts. Next, the accessibility of each platform’s privacy information was evaluated. The researchers took note of the presence (or lack thereof) of the Privacy Policy on the home page, and if the Privacy Policy or Terms of Use discussed how data were collected and used. The reading levels of the Privacy Policy and the Terms of Use were also evaluated using the Flesch-Kincaid readability assessment embedded in Microsoft Word.

This reading test approximates a grade level required to understand a document, based on average sentence length and the average number of syllables per word. For example, a score of 8.0 implies that someone in 8th grade could understand the document34. Finally, each platform’s Privacy Policy was compared to each of the CMA’s “Privacy and Confidentiality” Codes to see if it was compliant35.
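The calculation behind this reading test can be sketched in a few lines. The grade-level formula itself is the standard Flesch-Kincaid one, but the syllable counter below is a rough vowel-group heuristic, so its scores will differ slightly from the counter embedded in Microsoft Word.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels,
    # dropping a trailing silent 'e'. Word's internal counter differs.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    groups = re.findall(r"[aeiouy]+", word)
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch-Kincaid Grade Level:
    # 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

Short, monosyllabic sentences score near (or below) zero, while the long, polysyllabic sentences typical of privacy policies push scores into the mid-teens, matching the grade-14-15 range reported below.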

A potential source of error for this study is that the authors did not have access to authoritative documents for these organizations and services. For example, some services did not appear to have privacy documents, user agreements, or explicit methods of funding after extensive searching by the researchers; however, this does not mean that those documents do not exist, only that they are inaccessible to the public. The fact that the authors of this study, who have significant training in digital literacy and research, could not locate these documents suggests that they are also likely inaccessible to the general population using these services.

 

Results

Business Model

As demonstrated in Figure 1, the majority of eHealth services were privately owned and operated (12). Beyond this, the other funding models included ownership by a non-profit organization receiving government funding (4) and publicly funded services (2).


Figure 1: Business models of the surveyed eHealth services

All of the publicly funded and non-profit eHealth services examined for this study were free to Canadian users. While some private eHealth services were free, most charged a fee for access to their product. Two models emerged: a pay-per-visit model and a membership-based model. The mean cost of accessing a pay-per-visit eHealth service was $42.89 Canadian dollars (CAD) while the median cost was $45 CAD. Alternatively, subscription-based models ranged from $15 CAD to $150 CAD per-month per-member.

Only two services, Livecare and Viva Care, both privately owned and operated, utilized advertisements.

User Data Collection

Private eHealth services collected both active and passive data more frequently than public or non-profit eHealth services.

 

Figure 4: Frequency measurement of passively collected user data, grouped by business model.

 

Actively Collected User Data


Figure 5: Frequency measurement of actively collected user data, grouped by business model.

 

Social Media

Only one service (YourDoctors.Online) allowed for sign-in through a third-party social media application. While the use of social media by these eHealth services could simply be viewed as a marketing or communication tactic, having users connect their social media accounts with these services may also present a further opportunity for data collection. One service directly requested social media information from users (GOeVisit) and two had an app available for download through a Google Play or an Apple account (Livecare and Maple), which would then be connected to other social media information (e.g. Google+).

Figure 6: Depth of social media engagement that services presented, grouped based on business model.

 

Privacy Policy, Terms of Use, and the Flesch-Kincaid Reading Level Test

Seventy-two percent of the services had privacy policies that were accessible from the home page. Sexualhealthontario, MDKonsult, and Ask The Doctor had neither a Privacy Policy nor a Terms of Use agreement.

The measures of central tendency for the Privacy Policies and Terms of Use show that both provisions tended to be similarly difficult to read and were often written above a high school reading level. Privacy policy scores and measures of central tendencies were calculated based on the 14 services that had privacy policies. Sexualhealthontario, youthspace.ca, MDKonsult, and Ask The Doctor did not have privacy policies and thus were factored out.

Terms of Use scores and measures of central tendencies were calculated based on the 13 platforms that had Terms of Use agreements. Sexualhealthontario, Medicuro, Mental Health Helpline, MDKonsult, and Ask The Doctor did not have Terms of Use agreements.

One service, GOeVISIT, had some problematic data policies, such as the note that the service uses both FaceTime and Skype, which are not bound by the United States’ Health Insurance Portability and Accountability Act (HIPAA), and that their data were stored by Rogers. This service did, however, make note that they employ a Privacy Officer. Whether or not the Privacy Officer offsets the possibility for data leakage is a point for ethical consideration.

Figure 7: The measures of central tendency for Privacy Policies and Terms of Use scored by the Flesch-Kincaid Reading Level Test.
Privacy Policy: range = 8.6-18.6, median = 15.1, mean = 14.4; Terms of Use: range = 9.2-17.9, median = 14.7, mean = 15.4.

 

Canadian Medical Association Code of Ethics

In order to analyze compliance with the CMA’s Privacy Policy, the authors analyzed each service’s Privacy Policy and Terms of Use agreement to locate the following ethical code provisions (depicted in Figure 8): protection of personal health information, awareness of patient rights, avoidance of public discussion, disclosure of information to third parties only with consent, taking steps to inform patients of responsibilities to third parties, and providing patients with a copy of medical records upon request. Each item was coded as “one” if the provision was satisfied by the service or coded as “zero” if the provision was not met for any reason (either the item was not addressed in the Privacy Policy or Terms of Use Agreement or the documents were not accessible).
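The pass/fail coding described above amounts to a binary matrix of services by provisions, from which per-service adherence counts and sample-wide summary statistics follow directly. A minimal sketch, using hypothetical service names and scores rather than the study’s data:

```python
from statistics import mean, median

# The six CMA "Privacy and Confidentiality" provisions checked (paraphrased).
PROVISIONS = [
    "protect personal health information",
    "awareness of patient rights",
    "avoid public discussion",
    "disclose to third parties only with consent",
    "inform patients of responsibilities to third parties",
    "provide medical records upon request",
]

# 1 = provision satisfied; 0 = not addressed, or no accessible policy
# documents at all. Services and scores here are hypothetical placeholders.
scores = {
    "Service A": [1, 1, 1, 1, 1, 1],
    "Service B": [1, 0, 1, 1, 0, 1],
    "Service C": [0, 0, 0, 0, 0, 0],  # no public-facing documents
}

# Summarize adherence per service and across the sample.
adherence = {name: sum(codes) for name, codes in scores.items()}
print(adherence)
print(mean(adherence.values()), median(adherence.values()))
```

In the study itself, each 0/1 judgment was made by human readers of the policy documents; the code only illustrates how those judgments roll up into the adherence figures reported with Figure 8.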

Figure 8: This chart delineates each service and its compliance or non-compliance with the CMA’s Ethical Codes relating to Privacy.
Mean CMA adherence = 4.17 provisions, median = 5 provisions.

 

Discussion

CMA’s Privacy and Confidentiality Codes

Of the 15 services surveyed which had an explicit privacy policy, six met the CMA’s provisions for Privacy and Confidentiality in full, while the remaining nine had privacy statements that met at least half of the CMA provisions. It is important to note that when a service provider’s policy did not make explicit mention of one of the CMA provisions, it was read as though the service provider was not abiding by the code. While the absence of a provision in the privacy policy does not necessarily mean that the service provider is not abiding by it, this absence is instructive: these policies tend to be quite lengthy and thorough, so for clarity all privacy measures taken ought to be stated explicitly.

As for adherence to specific CMA codes, each of the services examined had a policy that affirmed a general commitment to “Protect the personal health information of [their] patients36.” This code is quite general, but the unanimous adherence suggests, at the very least, a basic understanding of and engagement with privacy concerns. Likewise, all but one service explicitly affirmed that they would only disclose personal information with consent or as required by law, and would notify patients of any breaches.

The two codes most often unobserved (each unaddressed by five of the 15 services) were avoiding public discussion of sensitive information and providing patients with a copy of their medical record upon request. LiveCare (Private) and NEDIC (Non-Profit Organization) each failed to explicitly address three of the CMA’s six codes, the most of any of the services with available privacy policies. Notably, there was little discernible difference between public or not-for-profit organizations and private companies when it comes to observance of the CMA’s Privacy and Confidentiality Codes.

Business Model

As with any organization, understanding the funding model is intrinsic to understanding how it functions. Our primary concern regarding the funding models was borne out of the popular internet maxim, “[i]f you are not paying for it, you’re not the customer; you’re the product being sold37.” Of particular concern was the possibility that, absent a clear revenue stream for a service, the personal health data of users were at risk of being sold.

While a service may appear to be free for the user, there is the possibility that the owners of the applications may attempt to generate revenue in other ways that the consumer is implicated in but unaware of. Even for these ostensibly “free” private healthcare services, there must be some sort of revenue stream, and if there is no cost for the user, then it is possible the selling of user data could be a source of income. Leontiadis et al.38 found that 77% of the top free mobile applications were supported through targeted ads which required access to personal information, and that 94% of these applications also requested network access, which could potentially result in data being leaked. Since free applications tend to request more privacy permissions than paid applications, it is important to understand how the current advertisement model works.

Of the eHealth services that were surveyed, two were publicly funded, run directly by the Ontario government and offered free of charge (Ontario Mental Health Helpline and Sexual Health Ontario). In these situations, the funding models were quite clear. Beyond these publicly administered services, there were four services which were run by independent, not-for-profit organizations (NEDIC, Toronto Distress Centre, youthspace.ca and OTN). Of these four not-for-profits, OTN is the only one that is funded exclusively by the provincial government. The other three services are funded with a mix of grants (public and private), private donations, and corporate sponsorship, including Dove, Bell Canada and RBC for NEDIC, Hydro One Inc. and TD Bank Group for the Toronto Distress Centre and Canada Post for youthspace.ca39 40 41.

The remaining 12 services surveyed were privately funded, for-profit organizations. Of these 12, seven of the services (Maple, EQ Care, YourDoctors.online, Medicuro, MDKonsult, Akira, Ask the Doctor) charged patients a clear fee for each consultation ranging between $15 and $150. Dialogue has a similar fee-based model except, in this case, the fees are paid by the employer. Similarly, the fees for Livecare and Medeo are paid by healthcare providers in order to have access to “resources and support to help [them] maximize revenue42.” Lastly, Viva Care and GOeVisit are private companies which provide service to any patients with provincial healthcare coverage. While there is no direct fee for Canadian residents to use GOeVisit, non-Canadians can also access the service for $49.95 per consultation43. Advertisements were relatively rare, with only the privately-operated Livecare and Viva Care utilizing them.

While the presence of a clear revenue stream does not preclude the selling of user data, a lack of evidence of fee- or grant-based revenue streams should raise a red flag.

Accessibility of Privacy Information

As previously mentioned, the protection of personally identifiable health-related information has traditionally been held to very high standards. Strict regulations surround how medical practitioners store and keep physical and digital health records, and the advent of new technologies like telehealth and eHealth presents new challenges to sensitive health information and its protection. While the policies surrounding physical medical charts may be quite straightforward, matters become more complex with eHealth and the inclusion of video conference recordings or chat logs.

When dealing with any sensitive information, the disclosure of privacy policies is critical to the patient’s understanding of what will be done with their information. Having a Privacy Policy and Terms of Use document visible on the homepage of a website signals to the consumer that their privacy is being considered and that the service provider is aware of the serious responsibility that comes with access to such information. While simply having a Privacy Policy and Terms of Use document does not mean that data is necessarily being dealt with appropriately, it does reflect a certain level of awareness and sensitivity.

Of the 18 surveyed services, 12 had both a Privacy Policy and a Terms of Use document; two (Medicuro and Mental Health Helpline) had only a Privacy Policy; one (youthspace.ca) had only a Terms of Use; and the remaining three (MDKonsult, Ask The Doctor, and sexualhealthontario) had neither. While the presence of one document, either a Privacy Policy or a Terms of Use, may be sufficient to ensure patient privacy, services that lacked any visible policies whatsoever were quite concerning.

As stated above, Derse and Miller stressed the importance of physicians only engaging with eHealth services which had defined privacy policies in order to be sure that their patients’ information would remain confidential44. This recommendation for discretion over eHealth services applies to patients as well, since a number of the eHealth services examined (namely MDKonsult, Ask the Doctor and sexualhealthontario) do not disclose what will be done with the patient’s information. It is important to note that publicly funded organizations are not necessarily more forthcoming about their data use than private organizations, as sexualhealthontario, a provincial government program, lacks a privacy policy entirely.

Reading Level of the Privacy Policy and Terms of Use

An important facet of ensuring privacy policies are accessible is the reading level of the Privacy Policy documents themselves. Having a Privacy Policy and Terms of Use that can be easily found on a service’s website will not be very helpful to users if the documents are full of legal jargon that they cannot understand. Of the documents available, both the mean and the median reading level hovered around grade 14-15, or the second to third year of university. While 54% of Canadians between the ages of 25 and 64 have a post-secondary degree, and could presumably read at this level, a discernible portion of the population would still have difficulty understanding these policies45.

Without the ability to read and understand these documents it is difficult for individuals to make informed decisions about their healthcare and the use of their health data. This, of course, is assuming that individuals with a post-secondary education will make it to the privacy policy in the footer of these websites. Many of these sites have sleek and eye-catching designs which require the user to scroll past long pages outlining the benefits of their services, with videos that grab the user’s attention, in order to find the privacy policies at the bottom of the webpage.

Social Media

The majority of the services that were surveyed have social media buttons linking to their various accounts. While the presence of these services on social media and their request (implied or explicit) for users to follow them is not problematic, the request for greater connection between the patient and the healthcare provider increases the amount of data that could possibly be breached.

This is especially pertinent for the services which explicitly involved third-party social media applications. YourDoctors.Online allowed users to sign in using an existing Facebook or Gmail account. This is problematic as it links the sensitive health data already held by YourDoctors.Online to further personally identifiable information. Similarly, both Livecare and Maple had apps downloadable through Google Play or Apple accounts, again linking the existing data these services hold about a user to data from other online services. These two apps also present additional privacy concerns, as they have access to information on the user’s device, including images, general storage, camera, and microphone, which are implicated in gathering sensitive health information on the platform.

While it does not utilize social media specifically, GOeVisit facilitates its live consultations using FaceTime or Skype. The involvement of these third parties is explained in their Privacy Policy, which states:

…you assume all risks associated with disclosing your information through Skype™. You understand that, while Skype™ does not warrant that it complies with the HIPAA Security Rule, Skype™ does state that it uses well-known standards-based encryption algorithms to protect Skype™ users’ communications against unauthorized persons. You acknowledge that you have had the opportunity to review information about Skype™’s privacy, available here and its security, available here46.

While GOeVisit is compliant with each of the CMA’s six codes, sensitive health data could still be at risk when it comes into contact with third parties like Skype. Though companies like Skype use encryption, the fact that they are not in the business of securing health data specifically raises ethical concerns about the security of that data.

As more and more parties become involved in healthcare provision, the already complex issue of privacy in eHealth service becomes even more complicated. As seen above in the Skype statement, it is often assumed that the patient has taken the time to read the Privacy Policies and Terms of Use documents for the third parties involved in providing eHealth service, even if the links to these documents are not prominently displayed. Not only does a user have to locate and understand the service provider’s Privacy Policy and Terms of Use (if they are available), they must do the same for each of the third-party applications involved.

Further Research

While this study was primarily limited to a Canadian context, eHealth services are becoming more popular around the world. Further research could compare the situation in Canada with eHealth service in other countries. Investigating the difference between eHealth service in a public healthcare setting like Canada and a private system like the United States would be of particular interest. Specifically, in countries with private healthcare systems, citizens who would otherwise have to pay for healthcare may be more likely to turn to a free healthcare application. Studying the data collection and privacy policies of popular eHealth applications in these countries would also be of interest.

Additionally, another direction for future research could include a study that is qualitative in nature which seeks to explore what users of these services understand as personal health data. Such a study could explore understandings of the confidentiality of sensitive health data through the lens of a post-privacy society where data sharing is more prevalent.

 

Conclusion

From the data we collected on 18 live eHealth chat services hosted in Canada, we came to a number of conclusions about the state of privacy in Canadian eHealth services.

First, eHealth platforms that had a public-facing Privacy Policy made at least some reference to the CMA’s Privacy and Confidentiality codes. These codes were reviewed as recently as 2018, and we believe they are a helpful starting point for any eHealth service in Canada looking to address its Privacy Policy. As such, we recommend that existing and future eHealth service providers strive to meet each of the six codes if they do not already.

We were concerned about the possible sale of patient data by service-providers, and we were encouraged to find that each of the services that we researched had a clear funding model in place. Though this does not necessarily mean that patient data is well protected, we did not have any immediate concerns about the sale of data.

Perhaps the greatest barrier we identified for patients in understanding how their health data would be used was a lack of accessible privacy policies. We separated this notion of accessibility into two parts, the first being the presence of policy documents that can be found on the homepage of the website. While the majority of service providers had both Privacy Policy and Terms of Use documents that were easily accessible, it was concerning that three services had no public-facing policy documents whatsoever.

The second part of accessibility relates to the reading level required to comprehend these policies. The mean reading level of the policy documents accessed was between Grade 14 and 15, meaning that the roughly 46% of Canadians between the ages of 25 and 64 who do not have a postsecondary education may be unable to fully understand these documents. We recommend that organizations interested in having patients understand their privacy policies ensure that they are written in plain language and easily understood.

Finally, the integration of social media applications and other third-party service providers like Skype and FaceTime into eHealth services raised ethical concerns for us. While eHealth providers in Canada are bound by strict regulations, these third parties do not necessarily have the same responsibilities. When third-party involvement is unavoidable, we recommend that the primary eHealth provider be as open as possible about privacy policies.

 

Works Cited

blue_beetle. 2010. “User-driven discontent.” MetaFilter. Accessed 2018 March. https://www.metafilter.com/95152/Userdrivendiscontent#3256046.
Boyer, C. 2012. “The Internet and Health: International Approaches to Evaluating the Quality of Web-Based Health Information. In C. George, D. Whitehouse, & P. Duquenoy, eHealth: Legal, Ethical and Governance Challenges.” 245-274. Berlin: Springer.
Canada’s Health Informatics Association. 2012. “Health Informatics Professional Core Competencies.” Canada’s Health Informatics Association. November. https://digitalhealthcanada.com/wp-content/uploads/2017/03/Health-Informatics-Core-Compet.
Canadian Medical Association. 2018. CMA Policy: CMA Code of Ethics (Update 2004). Accessed March 2018. https://www.cma.ca/Assets/assets-library/document/en/advocacy/policy-research/CMA_Policy_Code_of_ethics_of_the_Canadian_Medical_Association_Update_2004_PD04-06-e.pdf
Carey, M. 2001. “The Internet Healthcare Coalition: eHealth Ethics Initiative.” Journal of the American Dietetic Association 101(8), 878.
Chaet, D., R. Clearfield, J. E. Sabin, and K. Skimming. 2017. “Ethical practice in Telehealth and Telemedicine.” Journal of General Internal Medicine, 32(10), October: 1136–1140.
Denecke, K., P. Bamidis, C. Bond, E. Gabarron, M. Househ, A. Lau, and M. Hansen. 2015. “Ethical Issues of Social Media Usage in Healthcare.” Yearbook of Medical Informatics, 10(1), 137-147.
Derse, A. R., and T. E. Miller. 2008. “Net Effect: Professional and Ethical Challenges of Medicine Online.” Cambridge Quarterly of Healthcare Ethics, 453-464.
Di lorio, C. T., and F Carinci. 2013. “Privacy and Health Care Information Systems: Where is the Balance?” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 77-105. Berlin: Springer.
Distress Centres. 2016. “Distress Centres 2016 Annual Report.” Distress Centres. Accessed March 2018. https://static1.squarespace.com/static/5a03516264b05fad2cec401c/t/5a15e64871c10b644b17793e/1511384651949/DC-Annual-Report-2016.pdf.
Dobby, Christine. 2013. “More than 90% of Canadians Can’t Get Enough of Google Poll.” Financial Post. January 07. Accessed 2018. http://business.financialpost.com/technology/more-than-90-of-canadians-cant-get-enough-of-google-poll.
Duquenoy, P., Mekawie, N. M., & Springett, M. 2012. “Patients, Trust and Ethics in Information Privacy eHealth.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, Whitehouse and P. Duquenoy, 275-295. Berlin: Springer.
Dyer, K. A. 2001. “Ethical Challenges of Medicine and Health on the Internet: A Review.”
Journal of Medical Internet Research, April-June.
Eysenbach, G., G. Yihune, K. Lampe, P. Cross, and D. Brickley. 2000. “Quality management, certification and rating of health information on the Net with MedCERTAIN: using a medPICS/RDF/XML metadata structure for implementing eHealth ethics and creating and rating of health information on the Net…” Journal of Medical Internet Research, 2E1.
Fleming, D. A., K. E. Edison, and H Pak. 2009. “Telehealth ethics.” Telemedicine Journal and e- Health: The Official Journal of the American Telemedicine Association, 797-803.
Geangu, I. P., D. A. Gârdan, and O. A. Orzan. 2014. “Medical Services Consumer Protection in the Context of eHealth Development.” Contemporary Readings in Law and Social Justice 6 (1): 473-482.
GOeVisit. 2018. “How it Works.” Accessed March 2018. https://goevisit.com/how-it-works
Government of Canada. 2010. “eHealth.” Government of Canada. August 9. https://www.canada.ca/en/health-canada/services/health-care-system/ehealth.html
Information and Communications Technology Council. 2009. “eHealth in Canada Current Trends and Future Challenges.” Information and Communications Technology Council. April. https://www.ictcctic.ca/wp-content/uploads/2012/06/ICTC_eHealthSitAnalysis_EN_04-09.pdf.
Kaplan, B., and S Litewka. 2008. “Ethical challenges of telemedicine and telehealth.” Cambridge Quarterly of Healthcare Ethics, 401-416.
Khoja, S., H. Durrani, and P. F Nayani. 2012. “Scope of Policy Issues in eHealth: Results From a Structured Literature Review.” Journal of Medical Internet Research, E34.
Kleinpeter, E. 2017. “Four Ethical Issues of “E-Health”.” Irbm, 245-249.
Lee, L. M. 2017. “Ethics and subsequent use of electronic health record data.” Journal of Biomedical Informatics, 143-146.
Leontiadis, I., C. Efstratiou, M. Picone, and C Mascolo. 2012. “Don’t kill my ads! Balancing privacy in an ad-supported mobile application market.” 12th Workshop on Mobile Computing Systems & Applications. San Diego, CA: HotMobile ’12.
Liang, B., T. L. Mackey, and K. M. Lovett. 2011. ” eHealth Ethics: The Online Medical Marketplace and Emerging Ethical Issues. Ethics in Biology, Engineering and Medicine.” 253-265.
Livecare. 2018. “Home.” https://www.livecare.ca/connect
Michalopoulos, S. 2016. “E-health and the ‘fine line’ of big data.” Euractiv. December 15. https://www.euractiv.com/section/health-consumers/news/special-report-e-health-and- the-fine-line-of-big-data/.
Microsoft. 2018. “Test your document’s readability.” Office Support. Accessed March 2018. https://support.office.com/en-us/article/Test-your-document-s-readability-85b4969e-e80a-4777-8dd3-f7fc3c8b3fd2#toc342546558
MyCare MedTech Inc. 2018. “Privacy Policy.” https://goevisit.com/privacy-policy
NEDIC. 2014. “Funding and Community Partners.” http://nedic.ca/about/funding-and-community-partners.
NEED2. 2016. “Funders.” https://need2.ca/funders/.
Razmak, J., and C. H. Bélanger. 2017. “Comparing Canadian physicians and patients on their use of e-health tools.” Technology in Society, 102-112.
Rippen, H., and R. Ahmad. 2000. “e-Health Code of Ethics.” Journal of Medical Internet Research, April: E9.
Rodwin, M. A. 2010. “Patient data: property, privacy & the public interest.” American Journal of Law & Medicine, 586-618.
Samavi, R., and T. Topaloglou. 2008. “Designing Privacy-Aware Personal Health Record.” In ER Workshops, 12-21. Berlin, Heidelberg: Springer.
Soenens, E. 2008. “Identity Management Systems in Healthcare: The Issue of Patient Identifiers.” IFIP AICT 298: The Future of Identity in the Information Society, 55-66.
StatCounter. 2018. “Search Engine Market Share in Canada.” March. http://gs.statcounter.com/search-engine-market-share/all/canada.
Statistics Canada. 2013. “Canadian Internet use survey, Internet use, by age group, Internet activity, sex, level of education and household income.” Statistics Canada. October 28. http://www5.statcan.gc.ca/cansim/a26?lang=eng&retrLang=eng&id=3580153&&pattern=&stByVal=1&p1=1&p2=31&tabMode=dataTable&csid=.
—. 2017. “Education in Canada: Key results from the 2016 Census.” Statistics Canada. https://www.statcan.gc.ca/daily-quotidien/171129/dq171129a-eng.htm.
Wadhwa, K., and D. Wright. 2012. “eHealth:Frameworks for Assessing Ethical Impacts.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and Duquenoy, 183-210. Berlin: Springer.
Webster, P. C. 2010. “Canada’s ehealth software “Tower of Babel”.” Canadian Medical Association Journal, December 14.
Whitehouse, D., and P. Duquenoy. 2008. “Applied Ethics and eHealth: Principles, Identity, and RFID.” IFIP AICT 298: The Future of Identity in the Information Society, 43-55.
Winkelstein, P. 2012. “Medicine 2.0: Ethical Challenges of Social Media for the Health Profession.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 227-243. Berlin: Springer.
Duquenoy, P., Mekawie, N. M., & Springett, M. 2012. “Patients, Trust and Ethics in Information Privacy eHealth.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 275-295. Berlin: Townsend, A., Leese, J., Adam, P., McDonald, M., Li, L. C., Kerr, S., & Backman, C. L. (2015). “eHealth,
Participatory Medicine, and Ethical Care: A Focus Group Study of Patients’ and Health Care Providers’ Use of Health-Related Information.” Journal of Medical Internet Research, 17(6), e155.
Fleming, D. A., Edison H. A, and Pak, A. 2009. “Telehealth ethics.” Telemedicine Journal and e-Health, (15)8 p. 797-803.
Kaplan, B., and Litewka, S. 2008. “Ethical challenges of telemedicine and telehealth.” Cambridge Quarterly of Healthcare Ethics, (17), Issue 4 p. 401-416.
16 Chaet, et al. 2017. p.1136–1140.
Whitehouse, D., and P. Duquenoy. 2008. “Applied Ethics and eHealth: Principles, Identity, and RFID.” IFIP AICT 298: The Future of Identity in the Information Society, 43-55.
Samavi, R., and T. Topaloglou. 2008. “Designing Privacy-Aware Personal Health Record.” In ER Workshops, 12-Berlin, Heidelberg: Springer.
Wadhwa, K., and D. Wright. 2012. “eHealth:Frameworks for Assessing Ethical Impacts.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 183-210. Berlin: 20 Soenens, E. 2008. “Identity Management Systems in Healthcare: The Issue of Patient Identifiers.” IFIP AICT 298: The Future of Identity in the Information Society, 55-66.
Derse, A. R., and T. E. Miller. 2008. “Net Effect: Professional and Ethical Challenges of Medicine Online.”Cambridge Quarterly of Healthcare Ethics 17(4) 453-464.
Di lorio, C. T., and F. Carinci. 2013. “Privacy and Health Care Information Systems: Where is the Balance?” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 77-105. Berlin: Springer.
Boyer, C. 2012. “The Internet and Health: International Approaches to Evaluating the Quality of Web-Based Health Information. In C. George, D. Whitehouse, & P. Duquenoy, eHealth: Legal, Ethical and Governance Challenges.” 245-274. Berlin:
Rodwin, M. A. 2010. “Patient data: property, privacy & the public interest.” American Journal of Law & Medicine, 586-618.
Kleinpeter, E. 2017. “Four Ethical Issues of “E-Health”.” IRBM, 38(5) 245-249.
Lee, L. M. 2017. “Ethics and subsequent use of electronic health record data.” Journal of Biomedical Informatics, Volume 71 143-146.
Winkelstein, P. 2012. “Medicine 2.0: Ethical Challenges of Social Media for the Health Profession.” In eHealth: Legal, Ethical and Governance Challenges, by C. George, D. Whitehouse and P. Duquenoy, 227-243. Berlin: Springer.
Liang, B., T. L. Mackey, and K. M. Lovett. 2011.” eHealth Ethics: The Online Medical Marketplace and
Emerging Ethical Issues. Ethics in Biology, Engineering and Medicine,p 253-265; Denecke, K., P. Bamidis, C. Bond, E. Gabarron, M. Househ, A. Lau, and M. Hansen. 2015. “Ethical Issues of Social Media Usage in Healthcare.” Yearbook of Medical Informatics, 10(1), p.137-147.
Michalopoulos, S. 2016. “E-health and the ‘fine line’ of big data.” Euractiv. Accessed March https://goo.gl/PrXN2X. 
Khoja et al. 2012.
Canadian Medical Association. 2018. CMA Policy: CMA Code of Ethics (Update 2004). March. Accessed March 2018. https://goo.gl/RBkrnS
Canadian Medical Association.
2018. “Search Engine Market Share in Canada.” Accessed March 2018. http://gs.statcounter.com/search-engine-market-share/all/canada.; Dobby, Christine. 2013. “More than 90% of Canadians Can’t Get Enough of Google: Poll.” Financial Post. January 07. Accessed 2018. https://goo.gl/HtgcP2 34 Microsoft. 2018. “Test your document’s readability.” Office Support. Accessed March 2018. https://goo.gl/AcPxro.
Canadian Medical Association.
2010. “User-driven discontent.” MetaFilter. (Blog comment). Accessed 2018 March. https://www.metafilter.com/95152/Userdriven-discontent#3256046.
Leontiadis, I., C. Efstratiou, M. Picone, and C Mascolo. 2012. “Don’t kill my ads! Balancing privacy in an ad-supported mobile application market.” 12th Workshop on Mobile Computing Systems & Applications. San Diego, CA: HotMobile ’12.
2018. “How it Works.” Accessed March 2018. https://goevisit.com/how-it-works.
Distress Centres. 2016. “Distress Centres 2016 Annual Report.” Distress Centres. Accessed March https://goo.gl/JiWKud.
2016. “Funders.” https://need2.ca/funders/.
2018. ” Home.” https://www.livecare.ca/connect.
Derse, A. R., and T. E. Miller. 2008. “Net Effect: Professional and Ethical Challenges of Medicine Online”. Cambridge Quarterly of Healthcare Ethics, 453-464.
Statistics Canada. 2017. “Education in Canada: Key results from the 2016 Census.” Statistics Canada. from https://www150.statcan.gc.ca/n1/daily-quotidien/171129/dq171129a-eng.htm
MyCare MedTech Inc. 2018. “Privacy Policy.” Accessed March 2018. https://goevisit.com/privacy-policy.

Information Technology and Modern Public Service:
How to Avoid IT Project Failure and Promote Success

Scarlett Kelly

 

Table of Contents

Introduction

Background

What are IT project failures and IT project successes?

The factors that determine IT project failures

    People

    Process

    Product

Case study

Recommendations for promoting IT project success

Conclusion

References

 

Introduction

Successful information technology (IT) projects have the potential to support and transform governmental functions, raising efficiency and cost-effectiveness in delivering services to end-users.1 In the ideal state, successful IT projects improve public service functions and deliver services to citizens more efficiently and cost-effectively.2 However, a global scan shows that IT projects fail as often as 85% of the time and succeed only 15% of the time.3

The reality is that Canada seldom realizes these benefits because of repeated IT project failures and the many unanswered questions behind them. How can we define IT project failures and successes? Who are the most important stakeholders in any IT project? Given the negative consequences of IT project failures, especially the financial burden, the purpose of this paper is to perform an in-depth analysis of IT project failure and success factors with real-life examples, including the Phoenix pay system4 as a case study.

To answer the research question “what are the factors that contribute to the success or failure of IT projects in governments,” this paper will examine the three key factors of IT projects — the people involved, the IT process (the purpose of the project, planning, and implementation with a focus on external/internal management), and the product delivered to users. After presenting a holistic understanding of IT project failures, this paper will make actionable recommendations in the Canadian context for promoting IT project success.

Based on the research done to date, the key finding is that many IT projects are politically motivated and departmental staff expertise is often overlooked. A lack of leadership and of sufficient in-house IT knowledge appears to have led governments to make decisions by instinct, so the failure to link IT products with internal departmental functions becomes inevitable.

 

Background

The application of IT has changed government no less fundamentally than the French Revolution reshaped Europe or printing technology shaped Western civilization.5 6 Electronic government, or e-government, provides a channel for citizens to communicate directly with the government in an online environment, which enhances citizen engagement and provides new forms of information presentation and consultation.7 However, IT projects do not always return benefits. The Auditor General of Canada (Auditor General) found that in the Department of National Defence (DND), IT project implementation took on average seven years, following an equally long seven-year funding-approval process.8 Moreover, by 1994 about $1.2 billion of the $3.2 billion IT program budget was not supported by a plan to monitor expenditure and implementation, and $700 million could have been saved if 11 projects had been implemented based on priority.9

In addition to technology failure, IT disruptions frequently happen because of:

  • the failure of planning, managing and decision-making when implementing IT products;
  • the lack of control and risk prevention.10

Three main factors contribute to the IT failure or success in the government—people, process and product.11

  • People include government departments, external stakeholders, and citizens/end-users.
  • The IT project process begins with finding the right project and ends with implementing the project within a specific timeline. In other words, the IT process is a business process which includes defining the purpose of introducing IT products (the question why), listing all the desired features of the product, purchasing or manufacturing the product by establishing partnerships with the private sector, developing a detailed and complete implementation plan (financing, deadlines, evaluation, and accountability), managing the implementation both internally and externally, and delivering a functioning final product. Both senior management commitment and project management play vital roles in the IT process because of the complex relations between the public and private sectors as well as the considerations within the government and with end-users, which will be discussed in detail in the “process” section.
  • Product includes the technology selected for users and its performance.

The three components are inter-related; for example, when people fail to work together and develop a common goal, the IT process will not go smoothly and meet the goals all stakeholders agreed upon. The obstacles in the IT product implementation process can lead to malfunctioning projects which directly result in the final products not meeting the needs of the stakeholders. However, this relationship is not linear. For example, any problems in the process could change the relations among the people involved, which can change the final product and the performance evaluation framework. All these will be discussed in detail in the next few sections of this paper.

 

What are IT project failures and IT project successes?

IT project success is defined as most or all stakeholder groups having attained their major goals, with no significant undesirable outcomes from the IT process and products.12 Such success is achieved by following project schedules, staying within budget, and delivering a final IT product that functions fully as expected. No part of the people, process, and product IT project lifecycle can fail.

The opposite, IT project failure, can be defined as a project whose outcomes are not what stakeholders initially expected. Such failure is caused by fragmented planning and mistakes in implementation, project management, and decision-making. These may include ignoring stakeholders’ needs or departmental operations that are less than ideal for IT product implementation and management, where control and risk prevention are lacking.13 IT project failures can result in delayed schedules, high costs, system uselessness and/or reliability problems,14 and worst of all, the loss of public confidence in the government’s ability and accountability when handling taxpayers’ money. Fear of potential IT project failure can make governments hesitate when considering introducing IT products, or decide not to adopt IT products at all in order to eliminate risk. Therefore, IT failure can create a vicious cycle: less innovation in IT projects due to fear, less effort put into IT projects because of repeated and perceived failure, and less chance of success.
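The schedule, budget, and functionality criteria above can be expressed as a simple check. The sketch below is purely illustrative; the names, numbers, and thresholds are hypothetical and are not drawn from the paper or from real project data:

```python
from dataclasses import dataclass

@dataclass
class ProjectStatus:
    """Hypothetical snapshot of an IT project against its baseline plan."""
    planned_months: int      # approved schedule
    elapsed_months: int      # actual duration to date
    budget: float            # approved budget
    spent: float             # actual spend to date
    required_features: int   # functionality stakeholders agreed upon
    delivered_features: int  # functionality working in the deliverable

def is_success(p: ProjectStatus) -> bool:
    """On the paper's definition, a project succeeds only if it is
    on schedule, within budget, and fully functional; failing any
    one criterion makes the whole project a failure."""
    on_schedule = p.elapsed_months <= p.planned_months
    in_budget = p.spent <= p.budget
    functional = p.delivered_features >= p.required_features
    return on_schedule and in_budget and functional

# An illustrative Phoenix-like outcome: late, over budget, missing functionality.
phoenix_like = ProjectStatus(24, 36, 310e6, 1000e6, 100, 60)
print(is_success(phoenix_like))  # False
```

The point of the conjunction is that the criteria are not traded off against each other: a product delivered on time and on budget that lacks required functionality is still, on this definition, a failure.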

One of many examples of IT project failure is Agriculture and Agri-Food Canada’s attempt to build a fully integrated system for farmers, Agconnex, which resulted in a $14 million failure.15 Another example is the Automated Land and Mineral Record System (ALMRS) in the U.S., which aimed to improve the Bureau of Land Management’s (BLM) ability to record, maintain, and retrieve information on land description and ownership. ALMRS’s major software component—the Initial Operating Capability (IOC)—failed to meet BLM’s needs and was not deployable.16 This was because BLM had failed to strengthen its investment management and system acquisition processes and lacked an overall project plan and timeline for action.17 After $411 million was spent, including $67 million on IOC software development, the project was terminated in 1998.18

 

The factors that determine IT project failures

Introducing IT products to organizations is not a linear technological change but involves complex human factors.19 Three main factors—people, process, and product—contribute to IT project failure or success in governments.20 Any interruption in the people, process, and product implementation stages can have ripple effects and eventually cause the project’s failure. For example, when there is stakeholder resistance, reaching a common goal and developing plans becomes more difficult. Such resistance also results in delays in project scheduling and difficulty in managing the IT implementation. These relationships are not sequential: any delay in the schedule can create difficulty in management, which then affects stakeholder and user confidence, as demonstrated in the graph below.

 

People

As the initial step that leads to IT project failures/successes, the people factor includes government departments, external/private sector stakeholders, and citizens/end-users. 21 The quality of departmental collaboration, stakeholder support/resistance and end-users’ comments on product functionality requirements can all become factors in the potential IT project failure/success.

First, the quality of departmental collaboration can set IT planning off smoothly, but the lack of an effective cross-departmental IT strategy and working governance can be a problem.22 For example, the Sustainable Access in Rural India (SARI) project in Tamil Nadu, India, was an e-government project that used Wireless-in-Local-Loop (WLL) technology to provide internet connections to 39 rural villages in the region.23 A lack of clearly defined goals was apparent from the beginning—the program only converted applications from paper-based to electronic submission, without changing the traditional, less transparent back-office operations.24 Such a product did not meet end-users’ expectations; only 12 villages used the service regularly.25 Furthermore, there was no cross-functional service delivery framework, due to the lack of collaboration with other levels of government, and the failure persisted in the long run.26 Nor was there sustained public leadership and commitment: the officials and staff involved changed constantly, and there was no consistent support for a new technology that disrupted traditional roles, authorities, and networks.27

Second, the development of IT projects often involves multiple stakeholders who have different management styles and goals.28 Their willingness to collaborate directly affects project success or failure. For example, the Department of Homeland Security (DHS) in the U.S. is responsible for the security of cyberspace. Its programs are aimed at recovery efforts for public and private internet systems, identifying laws and regulations regarding recovery in the event of a major internet disruption, evaluating plans for recovery, and assessing challenges.29 Clarification of roles and responsibilities is crucial because, in the course of internet recovery, the private sector owns and operates the majority of the internet.30 Yet there was no consensus among public and private stakeholders about what DHS’s role was or when it should get involved.31 The private sector was reluctant to share information on internet performance with the government, and the government could only take limited action due to legal issues.32 This lack of stakeholder engagement directly resulted in the failure of the DHS programs.

Third, end-users are one of the most crucial components, since their approval of service quality and product functionality indicates the success of IT projects. Meeting end-users’ performance expectations, such as system usefulness and information quality, is key to earning their approval of the technology deliverable. Romania’s e-government success largely depended on how well its services met citizens’ needs for usefulness, ease of use, quality, and trust in e-government services.33 In this sense, an IT product’s ability to influence citizens’ choices, offer personalized services, and build trust becomes the three pillars of citizen acceptance of IT products.34

 

Process

The process stage follows the engagement of the people factor. The process is a broad and complex business concept, which includes clarifying the purpose of introducing IT products as agreed upon by most or all stakeholders, identifying the desired features of the product, purchasing or manufacturing the product by establishing partnerships with the private sector, developing a detailed and complete implementation plan (finance, deadlines, evaluation, and accountability), managing IT product implementation both internally and externally, and delivering a functioning final product. The IT process is fundamentally a change management process, owing to the introduction of new technology,35 which establishes the purpose and motivation of the IT project, develops plans to implement IT products, and creates procedures to manage the implementation internally and externally. Because IT projects often involve stakeholders from both the public and private sectors (technology vendors and consultants), any conflict between their different approaches to managing IT projects can result in project failure. Therefore, the first step is to identify the purposes of the IT project by defining the different interests and identifying common goals among stakeholders in the same or different sectors. Understanding these different interests enables the various sectors to identify the desired features of the final IT product and develop a coordinated framework for working together. If the final IT project reflects the common goals and coordination, is delivered on schedule and within budget, and has all the required functionality in the final deliverable, the project can be considered a success. Any failure in the process can result in the failure of the project.

First, the business driver—why an IT product is purchased or designed and implemented—strongly influences how the product is implemented.36 For example, if an IT product is introduced because there is budget money left over at the end of the fiscal year, rather than to meet the identified needs of end-users, the project runs the risk of imposing an unwanted piece of technology on end-users and receiving negative reactions from them. Globally, e-government initiative failures often happen when political stakeholder interests, not users’ needs, determine the IT design and process.37 Political and business influences that create conflicts of interest can play a significant part in IT project failures. For example, a case study in a small town in Germany found that the municipal government faced mayoral elections and the former mayor was running for re-election. The government wished to initiate administrative reform, using the e-government implementation as one of the re-election campaign themes.38 While the project appeared to be high on the political agenda, the actual implementation was fraught with challenges because of conflicts within the project team. The following summary outlines some of the issues the project faced:

  • Differences in goal setting within a project team composed of different actors, including consultants, system engineers, researchers, and municipal public servants, who had different expertise and interpretations of how municipal government functions.39
  • Potential end-users were not involved until the IT product prototype was already built.40
  • The accountability framework was never established, so it was unclear who should pay for the software.41
  • The mayor actively sold the e-government solution for the election campaign, so doubts about the project’s failure were ignored.42

The system was never truly functional and existed only for election purposes. Identifying the drivers behind IT projects is therefore crucial to understanding why IT products are implemented, and defining the reason for adopting them is an important step toward coordination between different sectors and stakeholders.

Second, the planning stage is crucial for managing an implementation process that usually involves both the public and private sectors. Three areas need special attention to avoid the symptoms of IT project failure discussed earlier (delay, budget overrun, and a malfunctioning product): knowledge of IT products and of the selection and implementation process; the project agenda; and leadership during planning and implementation. The lack of knowledge can result in:

  • a negative attitude by decision makers towards IT products;
  • resistance to change by end users;
  • unwillingness to accept new ideas; and
  • fear of losing power or jobs.43

The lack of knowledge of products and vendors may also cause concern when selecting the IT product provider/designer from the private sector prior to contracting the implementation of the IT product.44 Without sufficient knowledge of the potential IT product and of private sector operations, monitoring, reviewing, and communicating with the private sector IT provider/designer during the implementation phase becomes difficult.45 Delays in schedule, higher costs, and ineffective service can therefore result from insufficient knowledge of the IT products, of private sector operations, and of how to manage both. One example is the intranet system implemented for the U.S. Navy, which by 2004 had run almost $4 million over cost and was completed two years later than the planned completion date.46 47 48

 


After accumulating in-house IT knowledge and identifying policy gaps, an agenda and leadership framework should be developed as the second step in planning. An agenda in the IT process is more than a technology implementation schedule: it sets a timeline that makes IT products acceptable to end-users, accompanies the change or transformation of organizational structure and practice in government, and sets expectations for on-time, on-budget, and functional deliverables.49 Overambitious or unrealistic agendas may be developed if the people involved in the IT process are politically or business driven and lack commonly agreed-upon goals. Strong leadership must be in place when developing realistic agendas, and it plays a crucial role in IT project planning: a revised agenda may be required to meet unexpected changes, address managerial concerns when both the public and private sectors are involved in the same project, and oversee the policy area.

A lack of leadership and of sufficient in-house IT knowledge, as well as gaps in IT policy and management, may be one reason the government relies on only a few oligopoly IT suppliers when outsourcing e-government design and implementation to the private sector.50 This mismanagement and lack of policy oversight is reflected in the Canadian government-funded Canada Health Infoway, a project which failed to establish a national electronic health record (EHR) after $1.6 billion was spent between 2001 and 2011.51 A case study of the 10-year history of Canada’s e-health plan (based on reviewing national reports and documents and on interviews with 29 key stakeholders responsible for health IT policy and strategy in national and provincial organizations) identified the lack of an e-health policy, inadequate involvement of clinicians, the lack of a business case for using EHRs, inadequate regional interoperability, and inflexibility of approach as major barriers to adopting a national EHR.52

Third, the process of implementing and managing IT projects should include managing the technology provider, the relevant departments, and end users. This is perhaps the most demanding area of the people, process, and product IT project lifecycle because of the various human factors involved across sectors with different interests, operations, and cultures. The public-private (sector) partnership (P3) arrangement has transferred some traditionally government-run projects to the private sector, effectively using the private sector’s comparative advantages in efficiency, flexibility, risk sharing, and the reduction of information asymmetry.53 Successful P3s are powerful promoters of IT project success because of the common goals and the different expertise that both sectors bring. For example, in Romania after 2000, an e-procurement system accommodated both public servants and private sector suppliers in its design and management. It enabled the government to manage hundreds of millions of US dollars in transactions annually and, through system transparency, reduced opportunities for corruption.54 Increasing e-government success can therefore lie in bridging the gaps between IT products and the reality of government through effective management of P3s, ensuring that private sector IT design matches public sector uses.55 However, many other factors, such as a lack of shared vision, respect, and trust, as well as weak consensual decision-making capabilities, may cause problems for P3s and effective external management.56

Ideally, the application of technology should enable the government to better serve the public. Many IT products are created and designed by the private sector. However, the private sector may lack an understanding of the unique processes, systems, structures, and culture of the public sector, and may be unable to meet governments’ timelines and specific organizational needs in IT design and implementation.57 Such differing views on the purposes of IT products make the internal management of IT products extremely difficult, because gaps in training and staffing and the existing public-sector culture may not meet the requirements of operating the products. Different operations are also reflected in different perspectives on recognizing risk and in different risk management styles. Although the public sector aims to minimize risk so as to be more accountable with the public’s money, a risk management plan sometimes does not exist or cannot solve problems when crises happen. The private sector is generally more risk-taking than the public sector, and the implementation of new IT products may not be risk-free. Potential risks may be reflected in neglecting the necessary transition between the government’s traditional management processes and the requirements of the new, technology-enabled management. For example, the underlying problem in the 2002 e-voting pilot projects in Liverpool and Sheffield, United Kingdom (UK), was the gap between a third-party commercial private sector technology provider and the traditional voting management process in the public sector, with potential risks left unaddressed before the election.58 The e-voting controversy in Florida during the 2000 U.S. presidential election is another example of government IT failure due in part to insufficient risk identification and the lack of a risk management framework.59

 

Product

The costliest elements of an IT project are the people and the processes. Without a clearly defined plan and change management strategy, the project is likely to fail.

However, the IT product is what citizens often perceive as the proof of IT project failure. There are several key measures of the functionality of IT products. For example, IT product quality measures information relevancy, credibility, accessibility, and, most importantly, service quality improvement.60 Other aspects include error-free data, consistency of technical performance, and timely and secure information delivery within government.61 Moreover, fully functional IT products should reflect the common goal that stakeholders originally agreed upon, without undesirable outcomes in the people and process steps. Malfunctioning IT products include not only products that function poorly, but also products that do not meet the needs of end-users or the organization.

A few theoretical models can be used to evaluate whether an IT product is a success or a failure. For example, the functionality of IT products can be summarized using DeLone and McLean’s62 six information system (IS) success components: information quality, system quality, service quality, IS use, user satisfaction, and perceived net benefit.63 End-users’ feedback should be the core of IT product performance evaluation, since citizens/end-users are among the most important stakeholders in e-government activities and their satisfaction is a factor in measuring e-government service success.64 Furthermore, evaluating IT products is not only an examination of the technology itself, but also a learning process for the whole IT project, in the interest of long-term continuous improvement.
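To make the evaluation idea concrete, the six DeLone and McLean components could be turned into a simple scorecard. The sketch below is only an illustration: the 1-5 rating scale, the equal weighting, and the sample ratings are assumptions for demonstration, not part of the model itself or of any real evaluation.

```python
# The six IS success components named by DeLone and McLean.
DIMENSIONS = [
    "information quality",
    "system quality",
    "service quality",
    "IS use",
    "user satisfaction",
    "perceived net benefit",
]

def success_score(scores: dict) -> float:
    """Average the six dimension ratings (assumed to be on a 1-5 scale).

    Requiring every dimension mirrors the model's point that product
    evaluation is multi-dimensional: a missing dimension is an error,
    not a zero.
    """
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"missing dimensions: {missing}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

# Hypothetical end-user survey ratings for an e-government product.
ratings = dict(zip(DIMENSIONS, [4, 3, 4, 2, 2, 3]))
print(round(success_score(ratings), 2))  # 3.0
```

In practice an evaluator might weight user satisfaction more heavily, since the paper treats end-user feedback as the core of product performance evaluation; the equal weighting here is only the simplest starting point.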

Further elements contribute to the failure to deliver functional IT products in the Canadian federal government. In federal government projects there is the potential for conflict between the political agenda for software selection and implementation and the needs of departmental end-users and/or the public. As mentioned in previous sections, successful projects are those which define the requirements and then select and implement a product with clearly defined goals, deliverables, and timelines. Failed projects may be those where, as has been identified, software is selected to meet political agendas or because budget money is available at year-end, without any attention to an effective implementation schedule or end-user requirements.

Therefore, with regard to the functionality of IT products discussed in the previous two paragraphs, IT products in the federal government may present various symptoms of failure: software purchased to meet political agendas, for example, can end up sitting on the shelf, never implemented. The Phoenix pay system is an example of a malfunctioning product that reflects political will, and it is discussed in the next section.

 

Case study

The Phoenix pay system—a malfunctioning, unfixable, and money-eating system—is a current, high-profile example of IT project failure in the Canadian federal government. The original proposal was to create a centralized payroll management system that would save approximately $700 million in salaries of compensation advisors in each department across the country.65 However, thousands of employees have been paid incorrectly, or not paid at all, since Phoenix launched in 2016.66 Current attempts to fix Phoenix could cost over $1 billion, while stopping its use could also cost hundreds of millions.67 As a typical IT project failure (over budget, a malfunctioning product, and loss of public confidence), Phoenix shows a series of failures across the people, process, and IT project management lifecycle. The Auditor General's 2018 report pointed to a few key issues behind the failure: a lack of project management from Public Services and Procurement Canada, inadequate engagement with and participation of departments and agencies in designing and building Phoenix, and implementation of the system before it was ready.68

By establishing a new payroll management operating model, Phoenix brought significant change to the federal government's complex payroll system, in areas such as staffing and skillset requirements.69 Political will, or the desire for immediate financial returns, appears to have prevented collaboration between departments in planning, communicating, and identifying and addressing risks and concerns.70 Stakeholder involvement appeared flawed, particularly with the technology provider IBM, from the early bidding stage onward.71 Top-down decision-making may have led to neglect of feedback from end-users, that is, the staff who used Phoenix daily.72 Accordingly, the failure to address institutional change, stakeholder collaboration, and end-user feedback may have produced the mismatch between Phoenix and government functions that resulted in project failure.73

After these initial failures around people, Phoenix experienced a fragmented and ill-managed IT process. While political will leaned toward implementing the software quickly during the change of governing parties,74 other stakeholders wanted a functional payroll system. Divergent purposes, motivations, and agendas produced multi-directional planning, implementation timelines, and deliverables. In-house IT knowledge appeared to be scarce, and project leadership, risk management oversight, and external and internal management were said to be virtually nonexistent. The project went ahead even after payroll advisors voiced concerns about system defects and risks.75 Phoenix's failure was arguably inevitable once system implementation and management were no longer based on goals agreed upon by all stakeholders. In other words, the failure of both the people and process components resulted in a payroll system that did not meet stakeholders' expectations, created more errors than existing processes could correct, and led to tremendous financial loss.76

 

Recommendations for promoting IT project success

IT project success requires collaboration among people, processes, and product. These factors are inter-related and continually changing as work environments and technologies evolve, which makes IT projects complex and vulnerable to failure. In this sense, IT projects are fragile: success requires every step to be agreed upon by the various people involved, and each step has a sequential influence on the next. For example, collaboration between government departments is a prerequisite for establishing leadership and achieving effective external management in a P3. Effective internal and external project and change management during IT implementation is key to ensuring that the IT product deliverables are on time, within budget, and, most importantly, functioning. Accordingly, every step must be carefully measured and managed, as demonstrated below.

To bridge the gap between the ideal state (IT project success) and reality (IT project failure), Goss Gilroy Inc. proposed 17 lessons to be learned from Phoenix.77 78

The following recommendations supplement those theoretical lessons with actionable steps, presented in a people, process, and product sequence, and are applicable not only to Phoenix but also to future IT projects in the Canadian context. The roles of IT experts and information managers in stakeholder engagement, in active participation in IT procurement, implementation, and management, and in gathering end-user feedback are emphasized.

The first step is the creation of a leadership role, such as a Minister of Information, responsible for people management and central coordination of IT projects. Distributing IT tasks across departments leaves no central authority, and stakeholder engagement and common goal-setting become difficult when no one coordinates and manages the complex relations among people and organizations. A Minister of Information would be crucial79 in overseeing IT initiatives: identifying the particular needs of a wide range of stakeholders, building relations with the private sector, monitoring planning and implementation timelines, and adjusting strategy as priorities and circumstances change. A Department of Information should act as the central coordinator, making effective use of other departments' resources and establishing clear accountability for IT projects. This department should include interdisciplinary experts in IT, information and data management, business/government relations, policy, economics, and psychology to link IT with other key government functions. Most importantly, the department should recognize IT experts and information managers as among the most important stakeholders, since their expertise can equip the public sector with necessary IT knowledge, match organizational needs with product features, and predict technical risks. Changes in IT processes, such as change management, communications, and in-house IT capacity, become possible once strong central leadership is established.

The second step, after establishing leadership, is to change project management responsibility in the IT process. This change manifests in both external and internal management. Externally, successful P3s are powerful promoters of IT project success because of the different expertise and resources each sector brings. Strong leadership is responsible for establishing common goals between the public and private sectors despite their different values and methods of operation. Risk sharing in customizing IT products, rather than buying off the shelf, must be negotiated. Successful implementation of the P3 model produces internal stakeholder support and a positive organizational culture that accommodates introducing and operating new IT products, identifying risks, and considering end-users. In the 2013 Florida iProject, a bidding process identified the top bidder with the most expertise and knowledge in software operating systems.80 Negotiations with the successful bidder followed to identify potential problems in collaboration, test design, and product implementation, and a method for adopting the agreed-on changes and a cost proposal were developed.81 The final product was a functional, one-stop service portal for traffic congestion.82

The project succeeded because the P3 model and effective external management returned a product that met the public sector stakeholders' needs.83

In internal management, it is crucial to achieve internal culture change: to develop an understanding that IT projects are not just purchases of technical products but a series of business process changes. Every project carries potential risks, so governments must proactively develop and adopt a risk detection and management framework rather than holding to the traditional assumption that governments do not engage in risky behaviour. Refusing to change the understanding of, attitude toward, and approach to IT projects only results in loss of control when unexpected situations arise. IT experts and information managers need to play an active role in both external and internal management. Their knowledge of IT products is valuable when communicating with the private sector about desired product features and when establishing a risk management framework. Internally, because IT experts and information managers also understand government operations, they can help achieve the culture change that treats IT implementation and management as a business process.

The U.S. Internal Revenue Service (IRS) e-file system in 1999, an electronic income tax return filing system, was a successful case of applying both external and internal management.84 Serving tens of millions of users each year, the IRS system is an example of a long-standing, successful partnership between government and a private sector vendor specializing in tax preparation and software development.85 The IRS defined the requirements for tax-return filing, and the vendor developed the software that enabled electronic filing.86 Changes in the IRS's internal management were reflected in strong leadership and a willingness to take risks.87 This positive change in external and internal management helped set a common goal, served shared interests, and created opportunities for all stakeholders. The smooth partnership keeps the private sector keen to work with the IRS and to understand its organizational policies and tax environment, while giving the IRS an opportunity to reengineer its tax filing process, achieving a simpler, faster, and virtually error-free state as well as expanding its market.88 The IRS also received support from existing stakeholders, such as the Council for Electronic Revenue Communications Advancement and the National Association of Enrolled Agents, and attracted new partners, such as third-party transmitters, credit card processors, and not-for-profit and professional groups.89

The third step is to change perceptions of IT products and raise awareness of their implementation and ongoing support and maintenance. An IT product is not a one-time purchase: it requires expert in-house knowledge in project planning, product functionality, budget modification, policy development, software and user evaluation, and continuous product and process improvement. IT experts and information managers' knowledge of IT products is particularly useful for long-term maintenance, risk management, and product improvement. When they become active stakeholders in all stages of IT project planning, negotiation, and implementation, their consistent, insightful knowledge of every stage can increase their influence on decision-making for that project and future ones. With their involvement, strong leadership should combine top-down and bottom-up approaches, as end-users, not political staff, should decide which products they will use. Since IT experts and information managers are often end-users themselves, or work closely with end-users for technical assistance, their input can be among the most valuable components in improving IT products and subsequent IT projects. Therefore, departments should also share IT best practices, experience, and knowledge with each other to create further IT value.

 

Conclusion

IT project failure implies that project outcomes are not what was initially expected and often brings undesirable results; IT project success implies that outcomes match most stakeholder groups' original goals, with no significantly undesirable outcomes in either the IT process or the product. Indicators of failure or success are driven by the project timeline, the budget, and the functionality of the final IT products. IT project failure is rarely due to malfunction of the technology itself, but rather to human factors and underlying government policy at different stages of IT product implementation.90 This paper answered the research question "what are the factors that contribute to the success or the failure of IT projects in governments?" through an analysis of the people, process, and product aspects of the IT project life cycle. Without leadership and sufficient in-house IT knowledge, governments tend to make decisions on instinct or on private sector counsel, making the failure to link IT products with internal departmental functions inevitable.91

In the Canadian context, the recommended three steps (establishing a leadership role to oversee people management and centralized coordination of IT projects, changing external and internal project management, and changing perspectives on IT products) are crucial to fundamentally improving current practice in IT product procurement and project management and to reducing, or even eliminating, IT project failures. The involvement of IT experts and information managers in all stages of a project, from business case to implementation, is crucial because they hold the technology knowledge. They should be part of the decision-making process, involved in negotiations over technology procurement or design, and responsible for gathering end-user input for IT product improvement. Large-scale IT project failures such as Phoenix will not stop until there are changes to leadership and management of the people, processes, and products involved in IT product selection and implementation. Only when Canada re-examines the relationships among people, IT implementation processes, and product selection, and makes the changes necessary to meet the requirements of IT project success, will it realize the benefits of IT projects.

 

Bibliography

Abdullah, Nurul Aisyah Sim, Nor Laila Mohd Noor, and Emma Nuraihan Mior Ibrahim. “Contributing Factors to E-Government Service Disruptions.” Transforming Government: People, Process and Policy 10, no. 1 (2016): 120-138. http://ezproxy.library.dal.ca/login?url=https://search-proquest-com.ezproxy.library.dal.ca/docview/1774537026?accountid=10406.
Al-Mamari, Qasim, Brian Corbitt, and Victor Oyaro Gekara. “E-government Adoption in Oman: Motivating Factors from a Government Perspective.” Transforming Government: People, Process and Policy 7, no. 2 (2013): 199-224. Accessed April 19, 2017. doi: 10.1108/17506161311325369
Alenezi, Hussain, Ali Tarhini, and Sujeet Kumar Sharma. “Development of Quantitative Model to Investigate the Strategic Relationship between Information Quality and E-government Benefits.” Transforming Government: People, Process and Policy 9, no. 3 (2015): 324-. Accessed April 19, 2017. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/1748862292?accountid=10406
Alshibly, H., and R. Chiong. “Customer Empowerment: Does It Influence Electronic Government Success? A Citizen-centric Perspective.” Electronic Commerce Research and Applications 14, no. 6 (2015): 393-404. doi: 10.1016/j.elerap.2015.05.003
Bagnall, James. “Bagnall: With AG’s report on Phoenix pay system looming, has government actually learned any lessons?” Ottawa Citizen, November 18, 2017. http://ottawacitizen.com/opinion/columnists/bagnall-with-ags-report-on-phoenix-pay-system-looming-has-government-actually-learned-any-lessons
Bhuiyan, Shahjahan H. “Modernizing Bangladesh Public Administration through E-governance: Benefits and Challenges.” Government Information Quarterly 28, no. 1 (2011): 54-65. Accessed November 19, 2017. doi: 10.1016/j.giq.2010.04.006.
Blumenthal, A. “The Long View.” Government Executive 39, no. 8 (2007): 63. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/204321170?accountid=10406
Ciborra, Claudio. “Interpreting E-government and Development: Efficiency, Transparency or Governance at a Distance?” Information Technology & People 18, no. 3 (2005): 260-79. doi: 10.1108/09593840510615879
Colesca, Sofia Elena, and Liliana Dobrica. “E-government Adoption in Romania.” Proceedings of World Academy of Science: Engineering & Technology, 44, 170-174. Accessed April 19, 2018. http://waset.org/publications/15361/e-government-adoption-in-romania
Das Aundhe, M. and Narasimhan, R. “Public Private Partnership (PPP) Outcomes in E- government – a Social Capital Explanation.” International Journal of Public Sector Management 29, no. 7 (2016): 638-58. doi: 10.1108/IJPSM-09-2015-0160
Delone, William H. and Ephraim R. McLean. “The DeLone and McLean Model of Information Systems Success: A Ten-Year Update.” Journal of Management Information Systems 19, no. 4 (2003): 9-30.
Dovifat, Angela, Martin Bruggemeier, and Klaus Lenk. “The “model of micropolitical arenas” – A framework to understand the innovation process of e-government-projects.” Information Polity: The International Journal of Government & Democracy in the Information Age 12, no. 3 (2007): 127-138. Accessed April 19, 2017. http://web.b.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=4&sid=8edccd57-d1eb-41be-b0e6-157afe952834%40sessionmgr102
General Accounting Office (GAO). Land Management Systems: Status of BLM’s Actions to Improve Information Technology Management: Report to the Subcommittee on Interior and Related Agencies, Committee on Appropriations, House of Representatives. Washington, DC: General Accounting Office, 2000.
Gillmore, Meagan. “More problems with Phoenix pay system revealed,” Rabble.ca, September 22, 2017. http://rabble.ca/news/2017/09/more-problems-phoenix-pay-system-revealed
Heeks, Richard. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?accountid=10406
Heinrich, Erik. “The big chill: E-government poses many promises for the enlightened age. But will it save us from red tape or deliver us into evil?” Info Systems Executive 6, no. 2 (2001): 10-13. Accessed April 19, 2018. http://www8.umoncton.ca/umcm-fass-administrationpublique/forum_2001/chill.pdf
Holden, Stephen H., and Patricia D. Fletcher. “The Virtual Value Chain and E-Government Partnership: Non-Monetary Agreements in the IRS E-File Program.” International Journal of Public Administration 28, no. 7-8 (2005): 643-664. doi: 10.1081/PAD-200064223
House of Commons Public Administration Select Committee. “Government and IT— ‘a recipe for rip-offs’: Time for a new approach,” (2011). Accessed November 19, 2017. http://www.publications.parliament.uk/pa/cm201012/cmselect/cmpubadm/715/715i.pdf
Imran, Ahmed, and Shirley Gregor. “Uncovering the Hidden Issues in E-Government Adoption in a Least Developed Country: The Case of Bangladesh.” Journal of Global Information Management, 18, no. 2 (2010): 30-56. Accessed April 19, 2018. doi: 10.4018/jgim.2010040102.
Ireton, Julie. “Phoenix payroll system doomed from the start: report,” CBC News, October 05, 2017. Accessed November 19, 2017. http://www.cbc.ca/news/canada/ottawa/phoenix-federal-government-report-lessons-1.4339476
Kelly, Scarlett. Digital Information Revolution Changes in Canada: E-Government Design, the Battle Against Illicit Drugs, and Health Care Reform. Alberta: Lammi Publishing Inc., 2016.
Kumar, Rajendra, and Michael L. Best. “Impact and Sustainability of E-Government Services in Developing Countries: Lessons Learned from Tamil Nadu, India.” The Information Society 22, no. 1 (2006): 1-12. doi: 10.1080/01972240500388149
Lammam, Charles, Hugh MacIntyre, Jason Clemens, Milagros Palacios, and Niels Veldhuis. “Federal Government Failure in Canada: A Review of the Auditor General’s Reports, 1988-2013.” Vancouver, British Columbia: Fraser Institute. Accessed April 19, 2017.
Lawther, Wendell C. “The Growing Use of Competitive Negotiations to Increase Managerial Capability: The Acquisition of E-Government Services.” Public Performance & Management Review 30, no. 2 (2006): 203-20. doi: 10.2753/PMR1530-9576300204
Luna-Reyes, Luis F. and J. Ramon Gil-Garcia. “Using Institutional Theory and Dynamic Simulation to Understand Complex E-Government Phenomena.” Government Information Quarterly 28, no. 3 (2011): 329-45. Accessed April 19, 2017. doi: 10.1016/j.giq.2010.08.007
McGlinchey, D. Navy streamlines its intranet contract. (2004). Accessed April 19, 2017. www.govexec.com/dailyfed/1004/100604d1.htm
Moynihan, Donald P. “Building Secure Elections: E‐Voting, Security, and Systems Theory.” Public Administration Review 64, no. 5 (2004): 515-28. doi: 10.1111/j.1540-6210.2004.00400.x
Office of Auditor General of Canada. “Report 1—Building and Implementing the Phoenix Pay System.” Last modified May 29, 2018. http://www.oag-bvg.gc.ca/internet/English/att e_43045.html
Osman, Anouze, Irani, Al-Ayoubi, Lee, Balcı, Medeni, and Weerakkody. “COBRA Framework to Evaluate E-government Services: A Citizen-centric Perspective.” Government Information Quarterly 31, no. 2 (2014): 243-56. doi: 10.1016/j.giq.2013.10.009
Randell, Brian. “A Computer Scientist’s Reactions to NPfIT.” Journal of Information Technology 22, no. 3 (2007): 222-234. Accessed November 19, 2017. doi: 10.1057/palgrave.jit.2000106
Rozenblum, Ronen, Yeona Jang, Eyal Zimlichman, Claudia Salzberg, Melissa Tamblyn, David Buckeridge, Alan Forster, David W. Bates, and Robyn Tamblyn. “A Qualitative Study of Canada’s Experience with the Implementation of Electronic Health Information Technology.” CMAJ: Canadian Medical Association Journal = Journal De L’Association Medicale Canadienne 183, no. 5 (2011): E281-8. doi: 10.1503/cmaj.100856
Schuppan, Tino. “E-Government in Developing Countries: Experiences from Sub-Saharan Africa.” Government Information Quarterly 26, no. 1 (2009): 118-27. doi: 10.1016/j.giq.2008.01.006
Scotti, Monique. “With Phoenix pay system fix potentially costing $1B, union says time to pull the plug,” Global News, November 14, 2017. Accessed November 14, 2017. https://globalnews.ca/news/3859554/phoenix-pay-system-fix-cost-billion-union/
Seng, Wong Meng, Stephen Jackson, and George Philip. “Cultural Issues in Developing E- Government in Malaysia.” Behaviour & Information Technology 29(4) (2010): 423-32. Accessed November 19, 2017. doi: 10.1080/01449290903300931
Treasury Board of Canada Secretariat. “Lessons Learned from the Transformation of Pay Administration Initiative, October 2017.” Last modified October 10, 2017. https://www.canada.ca/en/treasury-board-secretariat/corporate/reports/lessons-learned-transformation-pay-administration-initiative.html#1
U.S. General Accounting Office. Information Technology: Issues Affecting Cost Impact of Navy Marine Corps Intranet Need to Be Resolved (GAO-03-33). Washington, DC: Government Printing Office, 2002. Accessed November 19, 2017.
Wang, Yi-Shun and Liao, Yi-Wen. “Assessing EGovernment Systems Success: A Validation of the DeLone and McLean Model of Information Systems Success.” Government Information Quarterly 25, no. 4 (2008): 717-33. doi: 10.1016/j.giq.2007.06.002
Wilshusen, G.C. “Internet Infrastructure: Challenges in Developing a Public/Private Recovery Plan.” United States: Government Accountability Office, 2006.
Xenakis, A., and A. Macintosh. “Lessons Learned from the E-voting Pilots in the United Kingdom.” In E-government: Information, Technology, and Transformation (Advances in Management Information Systems, v. 17), edited by Hans J. Schnoll. Armonk, NY: M.E. Sharpe, 2010.

 

Works Cited

1Shahjahan H. Bhuiyan. “Modernizing Bangladesh Public Administration through E-governance: Benefits and Challenges.” Government Information Quarterly 28(1) (2011): 54-65. Accessed April 19, 2018. doi: 10.1016/j.giq.2010.04.006.
2Shahjahan H. Bhuiyan. “Modernizing Bangladesh Public Administration through E-governance: Benefits and Challenges.” Government Information Quarterly 28(1) (2011): 54-65. Accessed November 19, 2017. doi: 10.1016/j.giq.2010.04.006.
3Richard Heeks. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?accountid=10406
4Julie Ireton. “Phoenix payroll system doomed from the start: report,” CBC News, October 05, 2017. Accessed November 19, 2017. http://www.cbc.ca/news/canada/ottawa/phoenix-federal-government-report-lessons-1.4339476
5Erik Heinrich. “The big chill: E-government poses many promises for the enlightened age. But will it save us from red tape or deliver us into evil?” Info Systems Executive 6, no. 2 (2001): 10-13. Accessed April 19, 2018. http://www8.umoncton.ca/umcm-fass-administrationpublique/forum_2001/chill.pdf
6Scarlett Kelly. Digital Information Revolution Changes in Canada: E-Government Design, the Battle Against Illicit Drugs, and Health Care Reform. [Alberta: Lammi Publishing Inc., 2016]
7Tino Schuppan. “E-Government in Developing Countries: Experiences from Sub-Saharan Africa.” Government Information Quarterly 26, no. 1 (2009): 118-27. doi: 10.1016/j.giq.2008.01.006
8Charles Lammam, Hugh MacIntyre, Jason Clemens, Milagros Palacios, and Niels Veldhuis. “Federal government failure in Canada: A review of the Auditor General’s reports, 1988-2013.” Accessed April 19, 2017, [Vancouver, British Columbia: Fraser Institute].
9Ibid.
10Nurul Aisyah Sim Abdullah, Nor Laila Mohd Noor, and Emma Nuraihan Mior Ibrahim. “Contributing Factors to E-Government Service Disruptions.” Transforming Government: People, Process and Policy 10, no. 1 (2016): 120-138. http://ezproxy.library.dal.ca/login?url=https://search-proquest-com.ezproxy.library.dal.ca/docview/1774537026?accountid=10406.
11Ibid.
12Richard Heeks. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?accountid=10406
13Charles Lammam, Hugh MacIntyre, Jason Clemens, Milagros Palacios, and Niels Veldhuis. “Federal government failure in Canada: A review of the Auditor General’s reports, 1988-2013.” Accessed April 19, 2017, [Vancouver, British Columbia: Fraser Institute].
14Brian Randell. “A Computer Scientist’s Reactions to NPfIT.” Journal of Information Technology 22, no. 3 (2007): 222-234. Accessed November 19, 2017. doi: 10.1057/palgrave.jit.2000106
15Charles Lammam, Hugh MacIntyre, Jason Clemens, Milagros Palacios, and Niels Veldhuis. “Federal government failure in Canada: A review of the Auditor General’s reports, 1988-2013.” Accessed April 19, 2017, [Vancouver, British Columbia: Fraser Institute].
16General Accounting Office (GAO). Land Management Systems: Status of BLM’s Actions to Improve Information Technology Management: Report to the Subcommittee on Interior and Related Agencies, Committee on Appropriations, House of Representatives. United States, Office, 2000.
17Ibid.
18Ibid.
19Wong Meng Seng, Stephen Jackson, and George Philip. “Cultural Issues in Developing E-Government in Malaysia.” Behaviour & Information Technology 29(4) (2010): 423-32. Accessed November 19, 2017. doi: 10.1080/01449290903300931
20Nurul Aisyah Sim Abdullah, Nor Laila Mohd Noor, and Emma Nuraihan Mior Ibrahim. “Contributing Factors to E-Government Service Disruptions.” Transforming Government: People, Process and Policy 10, no. 1 (2016): 120-138. http://ezproxy.library.dal.ca/login?url=https://search-proquest-com.ezproxy.library.dal.ca/docview/1774537026?accountid=10406.
21Wong Meng Seng, Stephen Jackson, and George Philip. “Cultural Issues in Developing E-Government in Malaysia.” Behaviour & Information Technology 29, no. 4 (2010): 423-32. Accessed November 19, 2017. doi: 10.1080/01449290903300931
22House of Commons Public Administration Select Committee. “Government and IT— ‘a recipe for rip-offs’: Time for a new approach,” (2011). Accessed November 19, 2017. http://www.publications.parliament.uk/pa/cm201012/cmselect/cmpubadm/715/715i.pdf
23Rajendra Kumar and Michael L. Best. “Impact and Sustainability of E-Government Services in Developing Countries: Lessons Learned from Tamil Nadu, India.” The Information Society 22, no. 1 (2006): 1-12. doi: 10.1080/01972240500388149
24Ibid.
25Ibid.
26Ibid.
27Ibid.
28Angela Dovifat, Martin Bruggemeier, and Klaus Lenk. “The “model of micropolitical arenas” – A framework to understand the innovation process of e-government-projects.” Information Polity: The International Journal of Government & Democracy in the Information Age 12, no. 3 (2007): 127-138. Accessed April 19, 2017. http://web.b.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=4&sid=8edccd57-d1eb-41be-b0e6-157afe952834%40sessionmgr102
29G.C. Wilshusen. “Internet Infrastructure: Challenges in Developing a Public/Private Recovery Plan.” United States: Government Accountability Office, 2006.
30Ibid.
31G.C. Wilshusen. “Internet Infrastructure: Challenges in Developing a Public/Private Recovery Plan.” United States: Government Accountability Office, 2006.
32Ibid.
33Sofia Elena Colesca and Dobrica Liliana. “E-government Adoption in Romania” Proceedings of World Academy of Science: Engineering & Technology, 44, 170-174. Accessed April 19, 2018. http://waset.org/publications/15361/e-government-adoption-in-romania
34H Alshibly and R Chiong. “Customer Empowerment: Does It Influence Electronic Government Success? A Citizen-centric Perspective.” Electronic Commerce Research and Applications 14, no. 6 (2015): 393-404. doi: 10.1016/j.elerap.2015.05.003
35Claudio Ciborra. “Interpreting E-government and Development: Efficiency, Transparency or Governance at a Distance?” Information Technology & People 18, no. 3 (2005): 260-79. doi: 10.1108/09593840510615879
36Qasim Al-Mamari, Brian Corbitt, and Victor Oyaro Gekara. “E-government Adoption in Oman: Motivating Factors from a Government Perspective.” Transforming Government: People, Process and Policy 7, no. 2 (2013): 199-224. Accessed April 19, 2017. doi: 10.1108/17506161311325369
37Richard Heeks. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?accountid=10406
38Angela Dovifat, Martin Bruggemeier, and Klaus Lenk. “The “model of micropolitical arenas” – A framework to understand the innovation process of e-government-projects.” Information Polity: The International Journal of Government & Democracy in the Information Age 12, no. 3 (2007): 127-138. Accessed April 19, 2017. http://web.b.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=4&sid=8edccd57-d1eb-41be-b0e6-157afe952834%40sessionmgr102
39Ibid.
40Ibid.
41Ibid.
42Ibid.
43Ahmed Imran and Shirley Gregor. “Uncovering the Hidden Issues in E-Government Adoption in a Least Developed Country: The Case of Bangladesh.” Journal of Global Information Management, 18, no. 2 (2010): 30-56. Accessed April 19, 2018. doi: 10.4018/jgim.2010040102.
44Wendell C. Lawther. “The Growing Use of Competitive Negotiations to Increase Managerial Capability: The Acquisition of E-Government Services.” Public Performance & Management Review 30, no. 2 (2006): 203-20. doi: 10.2753/PMR1530-9576300204
45Ibid.
46Ibid.
47D. McGlinchey. Navy streamlines its intranet contract. (2004). Accessed April 19, 2017. www.govexec.com/dailyfed/1004/100604d1.htm
48U.S. General Accounting Office. Information technology: Issues affecting cost impact of Navy Marine Corps intranet need to be resolved (GAO-03–33). (2002a). Accessed November 19, 2017. [Washington, DC: Government Printing Office]
49Ahmed Imran and Shirley Gregor. “Uncovering the Hidden Issues in E-Government Adoption in a Least Developed Country: The Case of Bangladesh.” Journal of Global Information Management, 18, no. 2 (2010): 30-56. Accessed April 19, 2018. doi: 10.4018/jgim.2010040102.
50House of Commons Public Administration Select Committee. “Government and IT— ‘a recipe for rip-offs’: Time for a new approach,” (2011). Accessed November 19, 2017. http://www.publications.parliament.uk/pa/cm201012/cmselect/cmpubadm/715/715i.pdf
51Ronen Rozenblum, Jang, Yeona, Zimlichman, Eyal, Salzberg, Claudia, Tamblyn, Melissa, Buckeridge, David, Forster, Alan, Bates, David W, and Tamblyn, Robyn. “A Qualitative Study of Canada’s Experience with the Implementation of Electronic Health Information Technology.” CMAJ: Canadian Medical Association Journal = Journal De L’Association Medicale Canadienne 183, no. 5 (2011): E281-8. doi: 10.1503/cmaj.100856
52ibid
53M. Das Aundhe and Narasimhan, R. “Public Private Partnership (PPP) Outcomes in E-government – a Social Capital Explanation.” International Journal of Public Sector Management 29, no. 7 (2016): 638-58. doi: 10.1108/IJPSM-09-2015-0160
54Richard Heeks. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?acco untid=10406
55ibid
56M. Das Aundhe and Narasimhan, R. “Public Private Partnership (PPP) Outcomes in E-government – a Social Capital Explanation.” International Journal of Public Sector Management 29, no. 7 (2016): 638-58. doi: 10.1108/IJPSM-09-2015-0160
57Richard Heeks. “E-Government as a Carrier of Context.” Journal of Public Policy 25, no. 1 (2005): 51-74. Accessed April 19, 2018. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/58870342?acco untid=10406
58A. Xenakis and Macintosh, A. “Lessons learned from the e-voting pilots in the United Kingdom”. E-government: Information, technology, and transformation (Advances in management information systems; v. 17). (2010). Schnoll, Hans J Armonk, N.Y.: M.E. Sharpe.
59Moynihan, Donald P. “Building Secure Elections: E‐Voting, Security, and Systems Theory.” Public Administration Review 64, no. 5 (2004): 515-28. doi: 10.1111/j.1540-6210.2004.00400.x
60Hussain Alenezi, Ali Tarhini, and Sujeet Kumar Sharma. “Development of Quantitative Model to Investigate the Strategic Relationship between Information Quality and E-government Benefits.” Transforming Government: People, Process and Policy 9, no. 3 (2015): 324-51. Accessed April 19, 2017. http://ezproxy.library.dal.ca/login url=http://search.proquest.com.ezproxy.library.dal.ca/docview/1748862292?ac countid=10406
61ibid
62William H. Delone and Ephraim R. McLean. “The DeLone and McLean Model of Information Systems Success: A Ten-Year Update.” Journal of Management Information Systems 19, no. 4 (2003): 9-30.
63Wang, Yi-Shun and Liao, Yi-Wen. “Assessing EGovernment Systems Success: A Validation of the DeLone and McLean Model of Information Systems Success.” Government Information Quarterly 25, no. 4 (2008): 717-33. doi: 10.1016/j.giq.2007.06.002
64Osman, Anouze, Irani, Al-Ayoubi, Lee, Balcı, Medeni, and Weerakkody. “COBRA Framework to Evaluate E-government Services: A Citizen-centric Perspective.” Government Information Quarterly 31, no. 2 (2014): 243-56. doi: 10.1016/j.giq.2013.10.009
65Julie Ireton. “Phoenix payroll system doomed from the start: report,” CBC News, October 05, 2017. Accessed November 19, 2017. http://www.cbc.ca/news/canada/ottawa/phoenix-federal-government-report-lessons-1.4339476
66ibid
67Monique Scotti. “With Phoenix pay system fix potentially costing $1B, union says time to pull the plug,” Global News, November 14, 2017. Accessed November 14, 2017. https://globalnews.ca/news/3859554/phoenix-pay- system-fix-cost-billion-union/
68Office of Auditor General of Canada. “Report 1—Building and Implementing the Phoenix Pay System.” Last modified May 29, 2018. http://www.oag-bvg.gc.ca/internet/English/att e_43045.html
69Treasury Board of Canada Secretariat. “Lessons Learned from the Transformation of Pay Administration Initiative, October 2017.” Last modified October 10, 2017. https://www.canada.ca/en/treasury-board- secretariat/corporate/reports/lessons-learned-transformation-pay-administration-initiative.html#1
70Meagan Gillmore. “More problems with Phoenix pay system revealed,” Rabble.ca, September 22, 2017. http://rabble.ca/news/2017/09/more-problems-phoenix-pay-system-revealed
71ibid
72James Bagnall. “Bagnall: With AG’s report on Phoenix pay system looming, has government actually learned any lessons?” Ottawa Citizen, November 18, 2017. http://ottawacitizen.com/opinion/columnists/bagnall-with-ags- report-on-phoenix-pay-system-looming-has-government-actually-learned-any-lessons
73Luis F. Luna-Reyes, and J. Ramon Gil-Garcia. “Using Institutional Theory and Dynamic Simulation to Understand Complex E-Government Phenomena.” Government Information Quarterly 28, no. 3 (2011): 329-45. Accessed April 19, 2017. doi: 10.1016/j.giq.2010.08.007
74Monique Scotti. “With Phoenix pay system fix potentially costing $1B, union says time to pull the plug,” Global News, November 14, 2017. Accessed November 14, 2017. https://globalnews.ca/news/3859554/phoenix-pay- system-fix-cost-billion-union/
75James Bagnall. “Bagnall: With AG’s report on Phoenix pay system looming, has government actually learned any lessons?” Ottawa Citizen, November 18, 2017. http://ottawacitizen.com/opinion/columnists/bagnall-with-ags- report-on-phoenix-pay-system-looming-has-government-actually-learned-any-lessons
76Meagan Gillmore. “More problems with Phoenix pay system revealed,” Rabble.ca, September 22, 2017. http://rabble.ca/news/2017/09/more-problems-phoenix-pay-system-revealed
77Angela Dovifat, Martin Bruggemeier, and Klaus Lenk. “The “model of micropolitical arenas” – A framework to understand the innovation process of e-government-projects.” Information Polity: The International Journal of Government & Democracy in the Information Age 12(3) (2007): 127-138. Accessed April 19, 2017. http://web.b.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=4&sid=8edccd57-d1eb-41be-b0e6- 157afe952834%40sessionmgr102
78Treasury Board of Canada Secretariat. “Lessons Learned from the Transformation of Pay Administration Initiative, October 2017.” Last modified October 10, 2017. https://www.canada.ca/en/treasury-board- secretariat/corporate/reports/lessons-learned-transformation-pay-administration-initiative.html#1
79Scarlett Kelly. Digital Information Revolution Changes in Canada: E-Government Design, the Battle Against Illicit Drugs, and Health Care Reform. [Alberta: Lammi Publishing Inc., 2016]
80Wendell C. Lawther. “The Growing Use of Competitive Negotiations to Increase Managerial Capability: The Acquisition of E-Government Services.” Public Performance & Management Review 30, no. 2 (2006): 203-20. doi: 10.2753/PMR1530-9576300204
81ibid
82ibid
83Wendell C. Lawther. “The Growing Use of Competitive Negotiations to Increase Managerial Capability: The Acquisition of E-Government Services.” Public Performance & Management Review 30, no. 2 (2006): 203-20. doi: 10.2753/PMR1530-9576300204
84Stephen H. Holden and Patricia D. Fletcher. “The Virtual Value Chain and E-Government Partnership: Non-Monetary Agreements in the IRS E-File Program.” International Journal of Public Administration 28, no. 7-8 (2005): 643-664. doi: 10.1081/PAD-200064223
85ibid
86ibid
87ibid
88ibid
89ibid
90House of Commons Public Administration Select Committee. “Government and IT— ‘a recipe for rip-offs’: Time for a new approach,” (2011). Accessed November 19, 2017. http://www.publications.parliament.uk/pa/cm201012/cmselect/cmpubadm/715/715i.pdf
91Blumenthal, A. “The long view.” Government Executive, 39 no. 8, (2007): 63. http://ezproxy.library.dal.ca/login?url=http://search.proquest.com.ezproxy.library.dal.ca/docview/204321170?acc ountid=10406

Defending Traditional Rights in the Digital Age:
Ktunaxa Nation Case Study

by Michelle Barroca, BA, MAS

 

Abstract

This paper provides a case study of the Ktunaxa Nation’s experience with using business and cultural records to legally defend traditional Ktunaxa rights in opposing permanent development in a culturally significant and ecologically sensitive area in the southeast corner of British Columbia.

In 2016, the Ktunaxa Nation v. British Columbia case became the first Indigenous Spirituality and Freedom of Religion case to be heard in the Supreme Court of Canada. The Ktunaxa argued that the proposed location of a year-round ski resort in the Jumbo Valley would cause irreparable harm to their culture and spiritual beliefs.

Despite the Ktunaxa supplying decades’ worth of documentary evidence, including affidavits containing closely held Ktunaxa spiritual beliefs, the Crown determined that the proposed Jumbo Glacier Resort project does not infringe upon the Ktunaxa Nation’s right to religious practices. How is this possible when Ktunaxa spirituality, like that of other First Nations, is directly connected to the land?

Using the Ktunaxa Nation’s legal experience as the basis of this study, the author will highlight the challenges associated with documenting and using aboriginal traditional knowledge in the Canadian court system.

The content of this case study was presented by Michelle Barroca and Margaret Teneese at the ARMA Canada conference in Vancouver, BC on May 28, 2018, in addition to the ARMA Vancouver Island Chapter/Archives Association of BC conference in Victoria, BC on April 28, 2017.

 

Introduction to the Ktunaxa Nation

The Ktunaxa (pronounced “k-too-na-ha”), also known as Kootenay, Kootenai or Kutenai Indians, have inhabited the Kootenay and Columbia River Basins for more than 10,000 years. The Traditional Territory of the Ktunaxa Nation is approximately 70,000 sq. km within the Kootenay region of southeastern British Columbia, and historically included parts of Alberta, Montana, Idaho and Washington. Maps are provided on the following page for reference.

Ktunaxa citizenship comprises approximately 1,500 Nation members from seven Bands situated throughout Ktunaxa Traditional Territory. Five Bands are located in British Columbia, Canada, and two are located in Idaho and Montana in the United States.

Contact occurred within the past 200 years as European explorers moved westward towards the Rocky Mountains and into Traditional Ktunaxa Territory. Permanent settlement of non-aboriginal inhabitants in the East Kootenay region followed the discovery of gold near Fort Steele in the late 1860s.

In 1886, the reserve system was implemented under the Indian Act, creating the following Ktunaxa communities in British Columbia:

  • kyaknuqǂiʔit – Shuswap Indian Band (Invermere)

Note: Many Ktunaxa descendants are registered at the Shuswap Indian Band, which opted out of the Ktunaxa Nation Council in 2005.

  • ʔakisq̓nuk First Nation (formerly Columbia Lake Indian Band, Windermere)
  • ʔakink̓umǂasnuqǂiʔit – Tobacco Plains Indian Band (Grasmere)
  • ʔaq̓am – St. Mary’s Indian Band (Cranbrook)
  • yaqan nuykiy – Lower Kootenay Indian Band (Creston)1

The Ktunaxa language is unique among indigenous linguistic groups in North America. The written form of the language was only formally adopted in the 1970s, following the closure of the St. Eugene Mission Residential School in Cranbrook. Ktunaxa have place names for landmarks throughout the Traditional Territory and numerous heritage sites confirm this region as traditional Ktunaxa lands.

 

Organizational Context 

In preparation for the closure of the Residential School and the Indian Affairs Office in Cranbrook, BC, the Kootenay Indian District Council was formed in the 1960s to promote the political and social development of the Ktunaxa Nation.

Since 1982, the Ktunaxa Nation Council (KNC) has been incorporated as a registered non-profit society under the BC Societies Act and is operating as a government in anticipation of self-governance. KNC has been involved with treaty negotiations with Canada and BC since 1993 and is currently working under a revised approach towards self-government under the Rights Recognition and Reconciliation Framework. This new Core Treaty Model no longer includes “full and final” language.

KNC currently employs approximately 120 employees and contractors, and follows a sector model representative of the poles of a teepee:

  • Economic Investment Sector
  • Education and Employment Sector
  • Lands and Resources Sector
  • Social Sector (includes Health services)
  • Traditional Knowledge and Language Sector (includes Archives)
  • Core Services, includes:
    • Communications
    • Facilities Management
    • Finance
    • Human Resources
    • Information Technology
    • Records and Information Management

Each Sector is accountable to a Sector Council, which consists of elected representatives from each of the four Ktunaxa Nation communities. The main decision-making body, the Ktunaxa Nation Executive Council, consists of the Chiefs of the four Ktunaxa Nation communities (represented by KNC), the five Sector Council Chairs and the Ktunaxa Nation Chair.

 

Managing KNC Records

As information management professionals, we are keenly aware of the values associated with the records in our care. Most commonly, records are kept for administrative, operational, fiscal, legal and informational value to an organization. The complexities of managing recorded information in a First Nations environment are heightened by such variables as funding cycles, limited staff capacity, ever-changing technology, as well as the need to integrate intangible cultural resources, such as oral traditions and spiritual beliefs, into official recordkeeping systems.

Once traditional knowledge is documented and maintained as records, the challenges are not focused so much on the technology itself, but rather on how to leverage technology in order to ensure that access permissions are managed appropriately.

Every First Nation will need to determine what types of cultural information may be made available to whom, and when. Categories of sensitive cultural information are being identified in KNC policy with input from Nation members and Elders.
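To illustrate the kind of category-based access rules described above, a minimal sketch follows. The category names, roles, and permission mappings here are entirely hypothetical, invented for illustration; actual categories would be defined in Nation policy with input from members and Elders, not by a developer:

```python
from dataclasses import dataclass

# Hypothetical sensitivity categories (illustrative only; real
# categories would come from Nation policy, not from code).
PUBLIC = "public"
MEMBERS_ONLY = "members_only"
RESTRICTED_CEREMONIAL = "restricted_ceremonial"

# Hypothetical mapping of roles to the categories each may view.
ROLE_PERMISSIONS = {
    "general_public": {PUBLIC},
    "nation_member": {PUBLIC, MEMBERS_ONLY},
    "knowledge_keeper": {PUBLIC, MEMBERS_ONLY, RESTRICTED_CEREMONIAL},
}

@dataclass
class CulturalRecord:
    title: str
    category: str

def may_access(role: str, record: CulturalRecord) -> bool:
    """Return True if the given role is permitted to view the record."""
    return record.category in ROLE_PERMISSIONS.get(role, set())

song = CulturalRecord("Ceremonial song recording", RESTRICTED_CEREMONIAL)
print(may_access("nation_member", song))     # False
print(may_access("knowledge_keeper", song))  # True
```

The point of the sketch is that once categories are defined in policy, enforcing them in an EDRMS or file share becomes a mechanical mapping of roles to categories rather than ad hoc, per-document decisions.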

The Ktunaxa Nation Council places great importance on effectively managing its information resources and has been working towards a corporate-wide Records and Information Management Program over the past decade. Early commitment to information management began in 1997 when the Ktunaxa Nation Archives was established to support treaty negotiations.

As the official repository for all KNC inactive records with enduring value, the Archives operates in close connection with the Records and Information Management Program. The Archives contains KNC business and Ktunaxa cultural records that date back to pre-Confederation in 1867. The Archives provides access services by appointment during regular business hours. Research requests are typically received from KNC staff and contractors, community members, as well as authors, students and lawyers. The Archives has a request process in place that is guided by an Access and Use Policy and related procedures.

Active recordkeeping activities within offices reflect a hybrid environment, with staff typically relying on electronic documents for convenience, while treating paper records as “official” records. Staff tend to rely on their email systems and shared network drives for storing and accessing records. The challenge with these practices is that records are not necessarily available to others in KNC who need to access the information.

To improve the systematic control of KNC records, an electronic document management system has been in use since 2007 to store and access important KNC records across the organization, including Council meeting records, resolutions, executed agreements, policies, Traditional Use Studies, and transcribed interviews with Ktunaxa members. Security permissions are applied to the folder structure to ensure appropriate protection of sensitive and confidential business and cultural information. A project is currently underway to replace the system with a comprehensive electronic document and records management system (EDRMS) that will have robust records management functionality while being focussed on providing a positive end-user experience.

Digitization efforts have been led by the Ktunaxa Nation Archives and the Records and Information Management Program, although all KNC employees are able to scan documents on demand using optical character recognition (OCR) software from desktop scanners and multi-purpose printers. The majority of users send scans via email or save them to shared drives, while key records are saved to the electronic document management system.

The business and cultural records of KNC, like other First Nations, are intimately linked and cannot be physically separated into strict “business” and “cultural” categories.

Since recorded information is stored in various physical and electronic places, including offices, email systems and networked file shares, intellectual control is the key to managing KNC’s records. Finding aids, such as box inventory lists, are vital to identifying and locating records stored across the organization. Access to the finding aids is generally restricted to records staff due to confidentiality and privacy.
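As a rough sketch of how such a finding aid supports intellectual control, a box inventory can be modeled as structured entries that record where each series is held, searchable by series title. The field names and entries below are hypothetical, not drawn from KNC’s actual inventory format:

```python
from dataclasses import dataclass

# Hypothetical box-inventory entries: each record series notes where
# it is physically or electronically stored.
@dataclass
class InventoryEntry:
    box_id: str
    series: str
    date_range: str
    location: str  # office, vault, server, etc.

inventory = [
    InventoryEntry("B-001", "Council meeting minutes", "1995-2000", "Archives vault"),
    InventoryEntry("B-002", "Treaty negotiation files", "1993-1998", "Archives vault"),
    InventoryEntry("E-014", "Transcribed Elder interviews", "2005-2010", "EDMS server"),
]

def find_series(keyword: str) -> list[InventoryEntry]:
    """Locate entries whose series title contains the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [e for e in inventory if kw in e.series.lower()]

for entry in find_series("treaty"):
    print(entry.box_id, entry.location)  # B-002 Archives vault
```

Because the finding aid, rather than physical co-location, is what ties dispersed holdings together, restricting access to it (as KNC does, to records staff) effectively restricts discovery of the records themselves.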

In order to accurately and effectively manage the records of a First Nation, the Archivist and/or Records Manager must be well informed of the content and context of the records. At the Ktunaxa Nation Council, sensitive cultural information is closely guarded and not necessarily shared with all Ktunaxa members. For instance, certain songs, dances and ceremonies are believed to cause physical harm to a person when provided out of context or at the wrong time. It is imperative to listen to the First Nations community with respect to the types of information that must be protected, and when it may be disclosed.

 

Overview of the Disputed Project

Supported by foreign investors, the Pheidias Project Management Corporation of Vancouver, BC, submitted a proposal to the Province of British Columbia in 1991 to convert 6,000 hectares of public land into a permanent, year-round, European-style ski resort in the remote Jumbo Valley.2

A map of the project location with a design image of the proposed resort is provided below.3


The Jumbo Valley area is adjacent to the Purcell Wilderness Conservancy Park and currently lacks year-round road access. The project intends to replace the existing helicopter-skiing access with a network of 23 ski lifts across four glaciers, which are themselves moving, at an elevation of 3,400 metres. Limited construction has been completed to date, with concrete slabs for the day lodge placed in a known avalanche path.4

In this case, the location of the proposed Jumbo Glacier Resort project is the central issue of the dispute. The Jumbo Valley site was originally identified in the 1982 Federal-Provincial study entitled “The BC Rocky Mountain Tourism Region” as one of the two locations that would be suitable for destination ski resorts in the East Kootenays.5 Consultation with the Ktunaxa Nation was not part of the study.

The Ktunaxa Nation is not generally opposed to development within their Traditional Territory and actively collaborates with government and industry. In this particular case, however, the Ktunaxa have been steadfastly opposed to the proposed Jumbo Glacier Resort project since its inception for ecological, cultural and spiritual reasons.

Qat’muk (pronounced “got-mook”) is the Ktunaxa name of the lands in the central part of the Purcell Mountains that include the area of the proposed Jumbo Glacier Resort. The location is at the core of the territory of the Ktunaxa Nation and is the home of Grizzly Bear Spirit, which is a unique and indispensable source of collective as well as individual guidance, strength and protection. For the Ktunaxa, the importance of Qat’muk for Grizzly Bear Spirit is indivisibly linked with its importance for living grizzly bears, now and into the future.6

From the time the project was proposed in 1991, the Ktunaxa Nation chose to withhold detailed spiritual information regarding Qat’muk and to rely largely on scientific findings to support its opposition to permanent development in the Jumbo Valley. Various studies were conducted regarding the potential impact of the Jumbo Glacier Resort development on grizzly bear populations, biodiversity and water quality.7 According to Kathryn Teneese, “The science was already saying the project was wrong and shouldn’t proceed. We didn’t think we needed to divulge our intimate spiritual beliefs, too.”8

Unfortunately, the Ktunaxa have been criticized for not sharing their spiritual information in detail from the outset.

As consultation with the Province continued through the 2000s, discussions tended to focus on finding “common ground” with the Ktunaxa Nation regarding the proposed ski development. Although the Ktunaxa Nation is completely opposed to permanent development in the Jumbo Valley, the Province and the Proponent tried to mitigate the impact of the proposed Jumbo Glacier Resort development by reducing the overall size of the resort. Because these offers missed the point that the location, not the size, of the resort is at issue, the Ktunaxa Nation decided that it was time to share their intimate spiritual beliefs to protect the home of Grizzly Bear Spirit.

Early in 2009, a Ktunaxa Chief came forward to share his recent experience with the Grizzly Bear Spirit with provincial decision makers. The Ktunaxa Nation formally requested a meeting with the Premier, who declined the invitation. In September of that year, a small number of Ktunaxa members and KNC staff met with Cabinet Minister Krueger and his Deputy Minister in confidence. Chief Luke shared his story entirely in the Ktunaxa language, and an interpreter was present to provide an English translation. The Minister seemed sympathetic to Ktunaxa beliefs, and the Ktunaxa felt optimistic that meaningful negotiations would continue.9

Shortly afterwards, a Provincial Cabinet shuffle occurred and Minister Thomson replaced Minister Krueger as the Minister of Tourism, Culture and the Arts. In an effort to continue strengthening the relationship with the Province, the Ktunaxa Nation invited the new Minister to visit Cranbrook and meet with the Nation. In 2011, Chief Luke shared the same story nearly word-for-word in Ktunaxa with Minister Thomson in confidence, with assistance from the same Ktunaxa interpreter.10

Disappointingly, Minister Thomson approved the Master Development Agreement with the Proponent the following year, and construction was permitted to begin in the Jumbo Valley. No mention was made of Ktunaxa spirituality or of Section 2 Charter rights in the Minister’s decision to approve the Agreement. As a result, the Ktunaxa Nation initiated legal proceedings against the Ministry in the BC Supreme Court.

A brief chronology of key events from 1991 to present is provided for reference in the appendix at the end of the case study. The key activities and decisions serve to illustrate the different perspectives between the Ktunaxa Nation and the BC Provincial government regarding land development and meaningful consultation over the past three decades.

 

Legal Context

The Canadian legal system is based on case law. Some important precedents have been set for First Nations by the Delgamuukw and Tsilhqot’in Supreme Court of Canada cases, to name only two. A number of excellent sources have explored these cases in detail, and only a high-level discussion will be provided here.

In Delgamuukw v. British Columbia, the Court gave greater weight to oral history of the Gitxsan and Wet’suwet’en people than to written evidence. Despite this determination, oral history was confined to the evidence phase only.11

The Tsilhqot’in v. British Columbia case broke new legal ground when the Court awarded aboriginal title to lands spanning 1,750 sq. km in the British Columbia interior. This decision recognized broad aboriginal ownership of general areas and is not “site specific.”12

On the surface, these decisions seem to provide strong legal precedents that would support the Ktunaxa Nation in their opposition to permanent development in Qat’muk. In reality, Canadian case law and common law have proven to be inadequate mechanisms for addressing aboriginal rights and spiritual beliefs.

The intrinsic dichotomy between predominantly Western and Indigenous ideology as it pertains to the Canadian legal system has been described as follows:

There are wide cultural gulfs between what Canadian law courts and aboriginal laws most trust. In Canadian courts, there is a heavy reliance upon sworn firsthand accounts, facts established by scientific methodology, probabilities demonstrated by statistical surveys, interpretations of the wording of textual business records, and the opinions of expert professionals. Canadian legal culture prefers that transactions be substantiated by signed and dated documents. It mistrusts hearsay. It is no surprise to Canadian courts when its witnesses, who promise God to tell the truth, lie. Most Canadians live in a predominantly literate and visually-oriented culture and their courts reflect this focus by their dependence on written proof and eyewitness testimony.13

Oral traditions are fundamental to the way First Nations document the past and interpret the present. They are integral to understanding Aboriginal Rights and Title. It is a collective responsibility amongst First Nations Chiefs, Elders and citizens to learn oral traditions, such as laws, stories, songs and dances, and to transmit that knowledge to future generations. Credibility and respect are given to oral traditions as accounts are told and retold, challenged and reconfirmed by First Nations community members throughout the centuries. Sadly, “when presenting traditional aboriginal evidence in Canadian courts, the system for creating the evidence is at odds with the system charged with evaluating it.”14

The general reliance on written documentary evidence over oral traditions in court is greatly concerning for First Nations. In addition to having their traditional knowledge and beliefs recorded in affidavits and dismissed as hearsay, First Nations contend with the many writings of early anthropologists, historians and even early government officials, which often include significant errors and continue to perpetuate misinformation. Moreover, most aboriginal oral traditions must be translated in order to be admitted as evidence in court, and much of the meaning is lost in the process. Serious consideration must be given to oral traditions within the legal context in order for First Nations to participate equally in court.

Regardless of the potential legal implications, the Ktunaxa Nation is actively documenting oral traditions as a mechanism for retaining and transmitting traditional knowledge. Meetings with the Elders Advisory Committee, for example, are being digitally recorded to preserve the invaluable information provided by Elders. The recordings are stored on a dedicated server with restricted access by Traditional Knowledge and Language staff. Potential uses of the information include developing language curriculum and Ktunaxa cultural publications, as well as providing documentary evidence to support the Ktunaxa in defending aboriginal rights.

 

Supreme Court of Canada Case

Through the consultation process and before the BC Supreme Court, the BC Court of Appeal and the Supreme Court of Canada, the Ktunaxa Nation has put on record a clear history of opposition to the proposed Jumbo Glacier Resort, based principally on the probable effects of the project on foundational Ktunaxa spiritual and cultural values.15

After four years of litigation in the British Columbia court system, the Ktunaxa Nation petitioned for their case to be heard in the Supreme Court of Canada. On December 1, 2016, the Ktunaxa Nation made their mark on Canadian legal history by bringing the first Indigenous Spirituality and Freedom of Religion case in Canada.

Despite significant public support, the Ktunaxa Nation was unsuccessful in arguing that the Minister’s 2012 decision to allow the Glacier Resorts project to proceed violates their right to freedom of conscience and religion protected by s. 2(a) of the Canadian Charter of Rights and Freedoms. This claim was asserted independently from the Ktunaxa’s s. 35 claim, under which aboriginal rights, including the right to practice their religion, are recognized and affirmed by s. 35(1) of the Constitution Act, 1982.16

The Supreme Court’s position can be illustrated using the following consecutive excerpts from the November 2, 2017 “landslide decision”:

…This case is not concerned with either the freedom to hold a religious belief or to manifest that belief. The claim is rather that s. 2(a) of the Charter protects the presence of Grizzly Bear Spirit in Qat’muk. This is a novel claim and invites this Court to extend s. 2(a) beyond the scope recognized in our law.17

We would decline this invitation. The state’s duty under s. 2(a) is not to protect the object of beliefs, such as Grizzly Bear Spirit. Rather, the state’s duty is to protect everyone’s freedom to hold such beliefs and to manifest them in worship and practice or by teaching and dissemination. In short, the Charter protects the freedom to worship, but does not protect the spiritual focal point of worship…In this case, however, the appellants are not seeking protection for the freedom to believe in Grizzly Bear Spirit or to pursue practices related to it. Rather, they seek to protect Grizzly Bear Spirit itself and the subjective spiritual meaning they derive from it. That claim is beyond the scope of s. 2(a).18

It would seem that the Supreme Court Justices misunderstood the Ktunaxa Nation’s argument. The Ktunaxa were not seeking protection for the freedom to believe in Grizzly Bear Spirit or to pursue practices related to it. Rather, they sought to protect Grizzly Bear Spirit itself and the spiritual meaning derived from it. It is impossible for the Ktunaxa to separate Qat’muk from the Grizzly Bear Spirit. If this claim is beyond the scope of s. 2(a) of the Charter, what hope does any First Nation have of protecting their sacred areas from destruction in the name of development?

In the words of Kathryn Teneese, Ktunaxa Nation Chair, “With this decision, the Supreme Court of Canada is telling every indigenous person in Canada that your culture, history and spirituality, all deeply linked to the land, are not worthy of legal protection from the constant threat of destruction.”19

Legal counsel for the Ktunaxa Nation, Karenna Williams of Grant Huberman, goes on to say:

The Ktunaxa people have been private or “secretive” with their spiritual beliefs. This is both because of what is called for within their own religious beliefs, and because it was required in order for the Ktunaxa people and the Ktunaxa religion to exist within Canada. Sharing information with the Provincial government, beginning a judicial review, appearing before the Supreme Court of Canada with this most personal, secret, and sacred information showed a great deal of faith on the part of the Ktunaxa. The willingness to expose the most sacred part of what it means to be Ktunaxa was not taken lightly. The Ktunaxa did so with the hope that the Court would, earnestly, and in good faith, see them. Hear them. The majority’s decision shows clearly that the Ktunaxa people were alone in their efforts to meet the Court and other Canadians halfway. It shows a profound failure for the Court to see the Ktunaxa people at all, let alone as people whose values and beliefs deserve respect and protection under the Charter of Rights and Freedoms.20

On the plus side, two dissenting Judges noted an inextricable connection between land and spiritual beliefs. This small comfort to the Ktunaxa Nation may yet prove to be an important legal instrument for other First Nations in Canada, as new law tends to be built on dissenting decisions.

Lastly, it is interesting to note that, although a significant volume of Ktunaxa Nation records was provided in electronic form, the nine Supreme Court Justices, each assisted by three Clerks, referenced the responsive records entirely on paper during the hearing. From a technological perspective, the answer to the “Are we there yet?” question, as it pertains to the Canadian legal system, is clearly “no.”

 

Conclusion

Through this case study, the Canadian court system has demonstrated that it is an inappropriate mechanism for resolving First Nations issues that are central to Aboriginal identity and beliefs. In the spirit of reconciliation, it is sincerely hoped that Justices will become more willing to acknowledge and define inherent Aboriginal rights within the scope of s. 2(a) of the Charter going forward. Without the willingness to acknowledge different perspectives of truth, reconciliation cannot happen.

At the time of writing, the Supreme Court of British Columbia released a decision regarding the Proponent’s case against British Columbia. To add insult to injury for the Ktunaxa Nation, the Court confirmed that the former Minister of Environment’s 2015 decision not to renew Jumbo Glacier Resort’s Environmental Assessment Certificate was “unfair.” The current Minister must now review the evidence associated with Minister Polak’s decision, which happened to be the only provincial decision that was sympathetic to Ktunaxa spiritual values and rights.

Kathryn Teneese expressed the Ktunaxa Nation’s disappointment in response to the Supreme Court decision as follows:

The irony of the situation is not lost on us…In the past, the courts went to great lengths to try to justify the Province’s failure to even consider the Ktunaxa right to freedom of religion under the Charter of Rights and Freedoms regarding this ski resort. For the developer, however, the court was more than willing to cancel the Province’s decision because something wasn’t apparently adequately addressed in the Province’s decision.21

Although the legal phase of the Ktunaxa journey to protect Qat’muk may be over, the Ktunaxa Nation will remain firmly opposed to any permanent development in the Jumbo Valley area. Like other Aboriginal people in Canada, the Ktunaxa have extensive oral traditions. Language and culture are deeply rooted in the land. The critical work that is being done at the Ktunaxa Nation Council will continue, including calling upon the Crown to act in the spirit of the Truth and Reconciliation Commission’s “94 Calls to Action”, finalizing the Qat’muk Stewardship Plan, and digitally capturing traditional knowledge as important information assets of enduring value to the Ktunaxa Nation.

 

Appendix: Chronology of Key Events

The following chronology summarizes the key events surrounding the proposed Jumbo Glacier Resort project in the Jumbo Valley. The source of the chronology from March 1991 to March 2012 is Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), Application for Leave to Appeal to the Supreme Court of Canada, which was accepted by the SCC on October 1, 2015.



Works Cited

British Columbia. BC Gov News. Jumbo Becomes Mountain Resort Municipality. 20 November 2012. Web. 23 August 2018.
British Columbia. BC Gov News. Jumbo Glacier Resort Project Not Substantially Started. 18 June 2015. Web. 23 August 2018.
British Columbia. Minister of Environment. Reasons for Minister’s Determination In the Matter of a Substantially Started Determination under Section 18(5) of the Act for the Jumbo Glacier Resort Project of Glacier Resorts Ltd. 18 June 2015.
Delgamuukw v. British Columbia. [1997] 3 S.C.R. 1010. Supreme Court of Canada. Web. 23 August 2018.
Glacier Resorts Ltd. v. British Columbia (Minister of Environment), 2018 BCSC 1389. Supreme Court of British Columbia. Web. 23 August 2018.
Ktunaxa Nation Council. Media Release: Ktunaxa Nation Disappointed with BC Supreme Court Ruling. [Cranbrook, B.C.] 21 August 2018.
—–. Qat’muk Declaration. [Cranbrook, B.C.] 2010.
Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations). Application for Leave to Appeal to the Supreme Court of Canada. 2015.
Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations). Court of Appeal for British Columbia. 2015 BCCA 352.
Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations). Supreme Court of British Columbia. 2014 BCSC 568.
Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), 2017 SCC 54. Supreme Court of Canada. 2017.
Pheidias Project Management Corporation, Jumbo Glacier Alpine Resort: Project Proposal Outline. [Vancouver, B.C.] 1995.
Pylypchuk, Mary Ann. The Value of Aboriginal Records as Legal Evidence in Canada: An Examination of Sources. Archivaria 32, Summer 1991.
Skimap.org. Jumbo Mountain Resort: Trail System for the Resort. 2000. Web. 23 August 2018.
Teneese, Kathryn. Correspondence. [Cranbrook, B.C.] [Provincial Government] 18 September 2017.
—–. Personal Interview. 4 April 2017.
—–. Personal Interview. 19 June 2018.
Tsilhqot’in Nation v. British Columbia. 2014 SCC 44, [2014] 2 S.C.R. 256. Supreme Court of Canada. Web. 17 August 2018.
Williams, Karenna. Correspondence. [Vancouver, B.C.] [Kathryn Teneese] 2 November 2017.

 

 

1Driving time between Invermere and Creston is approximately 4 hours, with Cranbrook located roughly in the middle. Grasmere is approximately 1 hour from Cranbrook to the southeast.
2Pheidias Project Management Corporation, Jumbo Glacier Alpine Resort: Project Proposal Outline, 1995.
3Skimap.org, Jumbo Mountain Resort, 23 August 2018.
4The concrete foundation of the day lodge was the only construction completed prior to the provincial government’s Environmental Assessment Certificate deadline in 2014. See Minister of Environment, Reasons for Minister’s Determination In the Matter of a Substantially Started Determination under Section 18(5) of the Act for the Jumbo Glacier Resort Project of Glacier Resorts Ltd., 18 June 2015.
5Ibid., p. 5.
6Qat’muk Declaration, 15 November 2010.
7See Appendix.
8Kathryn Teneese, Personal Interview, 4 April 2017.
9Ibid.
10Ibid.
11See Delgamuukw v. British Columbia [1997] 3 S.C.R. 1010.
12See Tsilhqot’in Nation v. British Columbia, 2014 SCC 44, [2014] 2 S.C.R. 256.
13Mary Ann Pylypchuk, Value of Aboriginal Records, Archivaria 32 (Summer 1991), p.66.
14Ibid., p. 52.
15Kathryn Teneese, correspondence 18 September 2017.
16Ktunaxa Nation v. British Columbia (Minister of Forests, Lands and Natural Resource Operations), Supreme Court Judgement [58].
17Ibid., [70].
18Ibid., [71].
19Ktunaxa Nation Council, Media Release, 2 November 2017.
20Karenna Williams, correspondence to Kathryn Teneese, 2 November 2017.
21Ktunaxa Nation Council, Media Release, 21 August 2018.
22BC Gov News, Jumbo Becomes Mountain Resort Municipality, 20 November 2012.
23Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), 2014 BCSC 568.
24Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), 2015 BCCA 352.
25BC Gov News, Jumbo Glacier Resort Project Not Substantially Started, 18 June 2015.
26Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), 2015 BCCA 352.
27Ktunaxa Nation v. British Columbia (Forests, Lands and Natural Resource Operations), 2017 SCC 54.
28Ibid.
29Glacier Resorts Ltd. v. British Columbia (Minister of Environment), 2018 BCSC 1389

Politique relative aux messages électroniques et son adoption

par Barb Bellamy, CRM, et Anne Rathbone, CRM

 

Introduction

Les politiques établissent sur papier les comportements et les objectifs attendus des employés d’une organisation. Une politique ou une directive n’est que le début de la démarche. Pour être efficace, une politique nécessite des standards, des règlements et des procédures la supportant, afin d’appliquer les exigences nécessaires. La politique relative à la gestion des messages électroniques est une des plus difficiles à mettre en œuvre et à appliquer. Le défi réside dans la perception, chez chaque individu auquel un compte de messagerie électronique a été assigné, qu’il en possède le contenu. Afin de vraiment convaincre chaque personne, une politique claire et concise doit être créée et appuyée par la haute direction. Ce qui n’est pas surprenant. Ce qui peut être surprenant, par contre, c’est la façon d’appliquer une politique qui a de telles implications personnelles pour les individus.

La mise en œuvre d’une politique relative aux messages électroniques requiert une approche personnalisée; avec de l’information qui est perçue comme étant personnelle, une approche personnalisée est nécessaire. L’utilisation de slogans, d’images et de ludification n’est qu’un exemple de personnalisation de la formation et de la mise en œuvre d’une politique relative aux messages électroniques. Mais la mise en œuvre ne se termine pas avec la formation. Il faut aussi faire le suivi pour s’assurer de la compréhension et du respect de la politique par chaque employé. Cette étape est critique afin de pouvoir démontrer le respect de la politique relative aux messages électroniques, en cas de besoin. La surveillance et les mises à jour sont également des étapes cruciales dans le développement et la mise en œuvre d’une politique pour chaque organisation, mais elles ne seront pas décrites dans le présent article.

 

Exigences des politiques relatives aux messages électroniques

Par où commencer lors de la création d’une politique relative aux messages électroniques? La publication d’ARMA International intitulée « Mise en œuvre de la politique relative aux messages électroniques »1 est une excellente source d’informations, et son annexe A offre 20 suggestions de dispositions de politique relatives aux messages électroniques. Mais ces dispositions ne devraient pas toutes être incluses dans une politique; celles qui seront choisies dépendront de l’organisation.

Exemples de dispositions suggérées :

  1. Des objectifs et une portée clairement définis, ainsi que les rôles et responsabilités pour l’approbation, la mise en œuvre et la révision de la politique.
  2. Un énoncé établissant que les messages électroniques appartiennent à l’organisation, peuvent être considérés comme des documents légaux et sont donc sujets aux politiques et procédures réglementant les archives et l’information.
  3. Une liste de toute autre politique, procédure ou directive concernant les messages électroniques, ainsi que toute loi ou règle applicable à la gestion des archives de l’organisation.
  4. Le droit de l’organisation à effectuer une surveillance et à réglementer l’utilisation et la transmission de messages électroniques, ainsi qu’une clarification des droits à la vie privée que l’employé pourrait avoir.
  5. Une explication de la façon dont les messages électroniques seront utilisés en cas de différend.

Le développement de la politique requerra la participation de plusieurs parties, incluant les services juridiques, des ressources humaines, de la gestion de l’information, de la technologie de l’information et toutes les unités opérationnelles. Il est donc recommandé d’affecter un comité ou une équipe à la politique relative aux messages électroniques. La haute direction doit approuver la politique et s’assurer que l’importance de son respect est communiquée à tout le personnel.

 

Exigences d’archivage

Une politique relative aux messages électroniques doit inclure les exigences quant à la sauvegarde; si votre organisation a une politique de sauvegarde ou d’élimination, il faudra inclure une référence à cette directive, politique ou à ce standard. Les exigences d’archivage sont déterminées et maintenues par le service de gestion de l’information et des archives, et doivent respecter toute exigence légale ou professionnelle, ainsi que tout règlement du secteur ou du gouvernement. La politique doit s’appliquer à tout message électronique envoyé ou reçu par l’organisation et indiquer quels messages électroniques doivent être conservés, la durée de leur archivage et dans quels emplacements ils doivent être conservés.

Un des aspects les plus importants de l’archivage est l’automatisation : les messages électroniques devraient être retirés du système de façon régulière, sans intervention manuelle. Cette automatisation élimine le risque d’erreur humaine, ce qui diminue significativement la responsabilité de l’entreprise. Elle doit inclure des mesures de contrôle permettant de tenir compte de tout différend en cours et d’archiver l’information avant l’élimination de quelque message électronique que ce soit. Un outil d’archivage des messages électroniques peut vous aider à vous conformer à une politique d’archivage automatisée.
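À titre d’illustration seulement, voici une esquisse minimale (en Python, avec des noms de champs hypothétiques) de la logique décrite ci-dessus : repérer les messages dépassant le délai de conservation, tout en excluant ceux visés par une mise en suspens pour des raisons juridiques.

```python
from datetime import datetime, timedelta

# Délai de conservation hypothétique de deux (2) ans
RETENTION = timedelta(days=730)

def messages_a_supprimer(messages, gels_juridiques, maintenant=None):
    """Retourne les identifiants des messages dépassant le délai de
    conservation, en excluant ceux visés par un gel juridique."""
    maintenant = maintenant or datetime.now()
    a_supprimer = []
    for msg in messages:
        if msg["id"] in gels_juridiques:
            continue  # différend en cours : conserver malgré l'échéance
        if maintenant - msg["recu"] > RETENTION:
            a_supprimer.append(msg["id"])
    return a_supprimer
```

Dans un système réel, cette logique serait exécutée périodiquement (par exemple, une tâche planifiée), et l’archivage de l’information de valeur durable aurait lieu avant toute suppression.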

Aux États-Unis, les exigences d’archivage des messages électroniques sont devenues partie intégrante des politiques relatives aux messages électroniques en 2006, lorsque les règles fédérales de procédures civiles (règles fédérales) ont été révisées afin d’inclure toute information électronique2, même si l’archivage des messages électroniques faisait déjà partie intégrante de plusieurs programmes de gestion de l’information. La règle 34 des règles fédérales vise « tout document désigné ou toute information archivée électroniquement — y compris les textes, dessins, graphiques, tableaux, photographies, enregistrements sonores, images ou toute autre donnée ou compilation de données — enregistrée sur quelque média que ce soit, depuis lequel de l’information peut être obtenue soit directement ou, si nécessaire, suite à une traduction par la partie répondante en un format raisonnablement utilisable ». Les règles fédérales sont des règlements qui déterminent les procédures pour les poursuites civiles devant les tribunaux fédéraux des États-Unis. Elles ne font pas partie du droit canadien fédéral, provincial ou territorial, mais elles peuvent avoir une valeur persuasive.

Le règlement général sur la protection des données (RGPD), qui a été approuvé en avril 2016 et est entré en vigueur en mai 2018, est un règlement de la loi de l’Union européenne (UE) sur la protection des données et de la vie privée pour tout individu dans l’UE et l’Espace économique européen. Le RGPD s’applique à toute organisation ayant une présence établie dans l’UE, qui offre des biens et services aux individus dans l’UE, ou qui effectue une surveillance du comportement des individus dans l’UE. Le RGPD a également exercé une pression sur les organisations canadiennes faisant affaire et échangeant de l’information avec l’UE; les messages électroniques jouent un rôle significatif lors de ces échanges.

Le RGPD stipule que « les données personnelles peuvent seulement être conservées légalement et traitées le temps nécessaire à la réalisation d’un but précis. Ceci nécessite une estimation prudente du délai adéquat de conservation des messages électroniques archivés, renforçant ainsi la nécessité d’une politique d’archivage des messages électroniques qui soit concrète et complète »3.

 

Échantillon de Politique relative aux messages électroniques

Il est souvent utile de consulter les politiques relatives aux messages électroniques d’autres organisations afin de voir s’il y a des éléments qui pourraient être inclus dans votre propre politique. Vous trouverez ci-dessous un échantillon de dispositions d’une politique relative aux messages électroniques provenant d’un organisme public.

Politique

La messagerie électronique (messages électroniques) est l’une des façons les plus utilisées de communication entre les employés de [Organisme public], et entre le [Organisme public] et le public. Les messages électroniques, comme toute autre archive créée et reçue par le [Organisme public], sont des documents officiels.

La politique sur la gestion de la messagerie électronique fournit une direction générale au sujet de la possession, l’organisation, l’archivage et la protection des messages électroniques conservés dans les comptes de messagerie électronique de [Organisme public].

But

Le but de la politique sur la gestion des messages électroniques est d’assurer que les messages électroniques ayant une valeur durable pour le [Organisme public] demeurent disponibles afin de satisfaire toute exigence légale, commerciale ou en matière de responsabilisation, et d’assurer la suppression régulière de messages électroniques n’ayant qu’une valeur transitoire.

Général

Le [Organisme public] se réserve le droit, sans le consentement de l’utilisateur, de surveiller, examiner, copier, archiver, envoyer et révéler le contenu des messages électroniques, spécialement en rapport avec toute enquête, procédure légale, inconduite professionnelle ou demande sous la Loi sur l’accès à l’information et la protection de la vie privée.

Les employés doivent s’assurer que la réponse automatique d’avis d’absence est en fonction et qu’elle contient le nom de la personne à contacter en cas d’absence prolongée.

Les règles usuelles de courtoisie doivent être respectées lors de l’utilisation du système de messagerie électronique du [Organisme public], selon les Directives et politique du réseau de technologie de l’information.

Les comptes de messagerie électronique d’employés ne travaillant plus auprès de [Organisme public] seront supprimés du système six (6) semaines après le dernier jour de travail dudit employé. Il est de la responsabilité du supérieur de l’employé qui quitte de s’assurer que les messages électroniques ayant une valeur durable demeurent disponibles et conservés dans les archives du système de gestion des archives et des documents électroniques.

Propriété

Tout compte de messagerie électronique est la propriété de [Organisme public] et est fourni à l’employé afin de faciliter la conduite des affaires de [Organisme public].

Les comptes de messagerie électronique peuvent être utilisés à des fins personnelles durant les périodes de temps non rémunérées, selon les Directives et politique du réseau de technologie de l’information.

Les employés n’ont aucun droit personnel ou exclusif sur les messages électroniques et les pièces jointes contenues dans les comptes de messagerie électronique de [Organisme public].

Conserver et archiver

Les messages électroniques de valeur durable pour le [Organisme public] seront :

• archivés selon le calendrier d’archivage de l’organisme;

• disponibles, suivant les restrictions de sécurité, et conservés aux archives de [Organisme public];

• ne seront pas conservés dans la boîte de réception ou sur un disque dur personnel de l’employé.

Une date d’échéance de deux (2) ans sera imposée à tout élément de compte de messagerie électronique. Les messages électroniques datant de plus de deux ans seront automatiquement supprimés.

Les messages électroniques envoyés à l’interne de [Organisme public] seront archivés par l’émetteur. Ceci inclut les messages envoyés et les réponses.

Les messages électroniques reçus de sources externes seront archivés par le destinataire. Si plus d’un employé est le destinataire, le premier employé sur la liste de destinataires conservera le message.
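Les deux règles de responsabilité ci-dessus peuvent être esquissées ainsi, à titre d’illustration seulement (en Python, avec un nom de domaine hypothétique « organisme.ca ») :

```python
def responsable_archivage(expediteur, destinataires, domaine="organisme.ca"):
    """Détermine qui doit archiver un message selon la politique :
    l'émetteur si le message est interne, sinon le premier employé
    (adresse interne) figurant sur la liste de destinataires."""
    def interne(adresse):
        return adresse.endswith("@" + domaine)
    if interne(expediteur):
        return expediteur  # message interne : archivé par l'émetteur
    # message de source externe : premier employé parmi les destinataires
    return next(a for a in destinataires if interne(a))
```

Une telle règle, codifiée dans un outil d’archivage, évite qu’un même message soit conservé en plusieurs exemplaires par chacun de ses destinataires.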

Protéger et sécuriser

L’accès au système de messagerie électronique de [Organisme public] est réservé aux employés et aux membres du conseil d’administration ayant reçu un compte de messagerie électronique.

Toute information confidentielle d’une tierce partie reçue par l’entremise du système de messagerie électronique du [Organisme public] devra être protégée de divulgation, d’utilisation ou de changement non autorisé ou accidentel.

Les signatures personnelles numérisées ne devront en aucun cas être utilisées dans les messages électroniques. Cette mesure protège contre leur utilisation de façon frauduleuse ou inappropriée.

Éliminer

Les messages électroniques transitoires seront gardés pour le temps nécessaire afin de remplir les exigences légales, commerciales ou en matière de responsabilisation.

 

La politique citée est un bon exemple de politique relative aux messages électroniques comme point de départ. Elle fournit une direction claire sur ce à quoi la politique se rapporte; les documents créés et reçus par le [Organisme public] sont des documents officiels. Le but est clair; les messages ayant une valeur durable restent disponibles afin de répondre aux exigences légales, commerciales ou en matière de responsabilisation, et d’assurer l’élimination régulière des messages électroniques. La politique omet de préciser l’information nécessaire aux employés afin de déterminer ce qui constitue un document officiel et ce qui constitue de l’information transitoire.

La section sur la propriété n’indique pas clairement « qui » ou « quoi » est le propriétaire du compte de messagerie informatique ou de son contenu. En cas de découverte, de qui est-ce la responsabilité de geler le(s) compte(s); l’employé, le supérieur ou tout autre fonctionnaire de l’organisation? Normalement, cette décision devrait être prise dans le cadre des activités générales de mise en suspens pour des raisons juridiques et être définie dans une politique de mise en suspens pour des raisons juridiques et ses documents de processus.

La section sur la conservation et l’archivage est très claire à propos des règles d’archivage et de suppression dans le cadre de la gestion des messages électroniques. « Une date d’échéance de deux (2) ans sera imposée sur tout élément de compte de messagerie électronique. Les messages électroniques datant de plus de deux ans seront automatiquement supprimés. » est un concept très intéressant, rarement mis en œuvre par les organisations. Sans propriétaire établi, qui applique cette politique?

Le gouvernement du Canada a un cadre de politique très utile qui regroupe toutes les directives, politiques et standards en une vision unique4. Ceci permet aux employés d’avoir une compréhension globale du traitement de l’information en tant qu’atout. La gestion des messages électroniques n’est qu’un des standards utilisés lors de la gestion des archives et de l’information critique au sein du gouvernement du Canada5.

 

Formation et engagement des employés

La première étape de tout programme de formation est d’obtenir la compréhension et l’engagement des employés. Pour ce qui est de la gestion des messages électroniques, la formation est une étape critique de la mise en œuvre. Il est difficile de faire accepter aux employés la notion que les messages électroniques puissent être considérés comme des documents officiels appartenant à l’organisation, et non à l’individu qui les a créés ou reçus. Leur fournir simplement la politique et leur demander de la lire n’est pas suffisant. Une formation devrait permettre aux employés de comprendre la politique elle-même et sa nécessité, ainsi que d’acquérir les connaissances pratiques pour l’utilisation de la technologie, des outils et d’autres ressources utiles afin de se conformer à la politique. Bien que les périodes d’orientation ou d’intégration soient des moments appropriés pour présenter la politique aux employés, la formation continue peut aider à inculquer le besoin de conformité et à rafraîchir la mémoire des autres employés sur tous les aspects de la politique.

Inclure quelques activités amusantes dans la formation peut augmenter l’engagement des employés et pallier le fait que les employés progressent à des vitesses différentes.

Quelques idées pouvant être incluses dans la formation :

 

Garder ou éliminer :

Donner à chaque employé entre cinq et dix messages électroniques, soit un mélange de messages transitoires et de documents d’archives. Les employés doivent les classer, dans un laps de temps donné, selon ce qu’ils veulent garder ou éliminer. Ensuite, discuter des décisions de chaque employé.

 

A, B ou C :

Donner à chaque employé trois cartes marquées A, B et C. Après avoir passé en revue la politique relative aux messages électroniques, poser des questions à choix multiples au sujet des dispositions incluses dans la politique. Remettre un prix à l’employé ayant eu le plus de réponses exactes. Par exemple :

À qui appartiennent les messages électroniques reçus dans le cadre de votre travail?

A. Vous — c’est votre nom sur le message électronique.

B. Votre organisation

C. Votre superviseur

 

Mots croisés ou mots cachés :

Créer un mot croisé en utilisant des termes de la politique. Le remettre aux employés pour qu’ils le terminent dans un laps de temps donné. Remettre un prix à l’employé ayant terminé le mot croisé en premier.

 

Jeopardy :

En utilisant des catégories en rapport avec la politique relative aux messages électroniques, les responsabilités des employés et la détermination de quels messages sont considérés des archives, les employés doivent fournir les questions aux réponses données. Par exemple :

Dans la catégorie « Archive ou transitoire », afficher la réponse « Un rendez-vous 2 semaines plus tôt » devrait entraîner la question « Qu’est-ce qu’un document transitoire? »

 

Thèmes :

Terminer la formation avec différents thèmes. Par exemple :

Avril est désigné comme étant le mois des archives et de l’Information chez ARMA. Lier l’évènement avec une activité de ménage de printemps des comptes de messagerie électronique des employés.

 

Sudoku:

Utiliser le mot « Complying » (se conformer) afin de créer un jeu de Sudoku6. Cette idée est empruntée au Pacific Coastal and Marine Science Center, qui fait partie de la US Geological Survey (la commission géologique des États-Unis).

 

Êtes-vous conforme à notre politique relative aux messages électroniques? Essayez de résoudre le jeu de Sudoku ci-dessous. Chaque grand carré de 9 petits carrés doit contenir le mot « COMPLYING ». Il en est de même pour chaque ligne et chaque colonne. Pouvez-vous compléter le puzzle afin que chaque grand carré contienne toutes les lettres de « COMPLYING », sans qu’une même lettre soit répétée dans un grand carré, une ligne ou une colonne?

 

Création d’image de marque

BusinessDictionary.com définit la création d’image de marque (branding) comme étant « Le processus impliqué en créant un nom et une image unique pour un produit dans l’esprit du consommateur, principalement grâce à des campagnes publicitaires ayant un thème constant. La création d’image de marque a pour but d’établir une présence significative et différenciée sur le marché, qui attire et garde une clientèle loyale. »7

La création de votre marque peut être une autre méthode pour inciter la participation de vos employés et de susciter de l’enthousiasme envers la mise en œuvre de la politique. Demandez aux employés de soumettre des idées et remettez un prix à la personne avec la proposition gagnante.

Quelques exemples de création d’image de marque pour une politique sont :

  • The biggest loser (Le plus grand perdant)
  • Please be a tosser (S’il-vous-plaît soyez jeteur)
  • Control your records before they control you (contrôlez vos archives avant qu’elles ne vous contrôlent)
  • To keep or not to keep … that is the question (garder ou ne pas garder… telle est la question)
  • Know when to hold’em, know when to throw’em (sachez quand les garder, sachez quand les jeter)
  • If you purge it, they will come (si tu les élimines, ils viendront)
  • RM is the only profession that knows in advance what it is going to forget (La “RM” est la seule profession qui sait en avance ce qu’elle va oublier)
  • Keep the Corporate memory — forget the clutter (garder la mémoire corporative — oubliez le désordre)
  • Email is a communication tool, not a filing cabinet! (Les messages électroniques sont un outil de communication, et non pas un classeur à dossiers!)

Utilisez l’image de marque gagnante comme outil de marketing afin de promouvoir la politique. Par exemple, le thème « Know when to hold’em, know when to throw’em » pourrait avoir une image de bateaux de jeux sur le Mississippi ou du Far West. Un parieur ou un cowboy pourrait en être la mascotte. La chanson « The Gambler » de Kenny Rogers pourrait jouer lors de l’entrée des employés pour la formation. Les jeux suggérés dans cet article pourraient être renommés pour faire partie du thème.

Certaines organisations ont même créé des vidéos amusants utilisant leur marque afin de publiciser leur politique relative aux messages électroniques et d’expliquer aux employés leur responsabilité à ce sujet. Imaginez le plaisir à avoir en utilisant le thème « Know when to hold’em » lors de la création d’un vidéo.

Bien que la création d’images de marque n’ait pas encore été utilisée pour mettre en œuvre une politique relative aux messages électroniques, le Sunshine Coast Regional District a utilisé cette technique pour la mise en œuvre de son programme de gestion des documents électroniques et des archives, Dr Know. Un savant fou sert d’icône. La formation est faite dans le laboratoire de Dr Know, les employés ayant complété avec succès leur formation sont appelés des « Je-sais-tout », et ceux qui ont reçu une formation spécialisée sont appelés les « spécialistes ». La formation avait aussi un thème : la formation d’introduction était appelée « Apprendre à connaître Dr Know » et la formation avancée « Obtenir vos résultats de la part de Dr Know ». Les jours fériés ont également servi dans le cadre de la formation, par exemple la Saint-Valentin (Bee My Record), Pâques (les collections de Pâques de Dr Know), la rentrée scolaire (retour à l’école, retour à la base) et les vacances estivales (MOJITO : Managing Our Jobs using Information To Our Advantage). La création d’images de marque et le marketing peuvent être très efficaces; il suffit d’un peu d’imagination.

 

Sommaire

Les politiques sont des règles ou directives créées lors d’un processus d’affaires spécifique. Elles sont des instructions formelles applicables à l’ensemble de l’organisation qui doivent être respectées par chaque individu dans l’organisation. Le non-respect de ces politiques peut entraîner des mesures disciplinaires. Il est essentiel à la bonne mise en œuvre de toute nouvelle politique d’avoir l’approbation de la haute direction.

Les politiques sont le bon mécanisme afin de faire parvenir votre organisation à sa destination; s’assurer que l’information est traitée comme un atout, et archivée ou éliminée selon les règles que vous déterminez. Il y a une courbe d’apprentissage qui peut ne pas sembler naturelle à vos employés au départ, lors du déploiement d’une politique. La mise au point d’une politique claire et concise, qui explique qui, quoi, où, quand et pourquoi, est le point de départ. L’ajout de directives, de standards, de consignes et de procédures aidera à assurer la compréhension de la politique et son respect. Organiser des sessions de formation et créer une marque qui est amusante et stimulante sera attirant pour la plupart des employés et les incitera à respecter la politique. Le but ultime des politiques est d’avoir un bassin d’employés qui comprend et respecte les politiques de l’information déterminées par l’organisme, incluant les messages électroniques.

Il ne suffit pas aux organisations de mettre en place la politique la mieux écrite. Elles doivent également être capables d’attester de la formation qu’elles offrent et de démontrer, par les signatures et les performances des employés, que ceux-ci ont été formés et comprennent la politique ainsi que les conséquences de son non-respect. Ceci peut être accompli par la signature d’une déclaration de complétion ou la réussite d’un examen portant sur la gestion des messages électroniques. Cette attestation et cette certification sont la preuve que les employés comprennent et respectent la politique relative aux messages électroniques; ces archives peuvent être utilisées lors de procès ou auprès des régulateurs pour prouver que l’organisation a fait preuve de diligence.

Les politiques doivent être révisées annuellement et, si nécessaire, actualisées ou réécrites. Si la politique est significativement modifiée, les employés doivent être avertis du changement et de la façon dont cela affectera leur rendement. Avec une formation appropriée et de la motivation, les employés utiliseront la politique relative aux messages électroniques et les documents qui y sont associés pour gérer leur compte de messagerie électronique comme prévu, créant ainsi une organisation plus forte et plus conforme.

 


Email Policy and Adoption

by Barb Bellamy, CRM and Anne Rathbone, CRM.

 

Introduction

Policies establish in writing the expected behaviors and outcomes of an organization’s employees. A policy or directive is only the beginning of the journey. To be effective, a policy requires supporting standards, regulations and procedures to apply the necessary requirements. An email management policy is one of the toughest policies to implement and enforce. The challenge lies in the perception, by each individual assigned an email account, that they own its contents. To truly get buy-in from each individual, a clear and concise policy must be created and supported by the organization’s top management. This is not a surprise to anyone. What may be a surprise is how to implement a policy that has such intimate implications for the individual.

Email policy implementation requires a personalized approach; with perceived personal information, a personal approach is critical. Using catch phrases, branding and gamification are some examples of personalizing the training and implementation of an email policy. Your implementation does not end with the training. There is also the need to track that each employee understands and will follow the email policy. This step is imperative to prove compliance with the email policy should it ever be needed. Monitoring and refreshing are also critical steps in the policy development and implementation for each organization but will not be covered in this article.

 

Email policy requirements

Where to start when creating an email policy? ARMA International’s publication, “Implementing Electronic Messaging Policies,”1 is an excellent resource, and its Appendix A provides 20 suggested provisions for an electronic messaging policy. Not all of the provisions would be included in every email policy; the provisions incorporated into a policy would depend on the organization.

Included in the suggested provisions are:

  1. A clearly defined scope and objectives, as well as the roles and responsibilities for approval, implementation and review of the policy.
  2. A statement that emails belong to the organization, could be records and are therefore subject to the organization’s records and information policies and procedures.
  3. A list of any other organizational policies, procedures, or guidelines that address email, as well as any applicable legislation or regulations that apply to the management of the organization’s email.
  4. The organization’s right to monitor and regulate the use and transmission of email, clarifying what privacy rights the worker may have.
  5. A statement of how email will be managed during litigation.

Developing the policy will require input from a variety of stakeholders including legal, human resources, information management, information technology and all business units. Therefore, striking an email policy committee or team is recommended. Senior level management must approve the policy and ensure that the importance of complying with the policy is communicated to all staff.

Retention Requirements

An email policy must include retention requirements; if your organization has a Retention or Disposition policy, include a reference to that directive, policy, or standard. Retention requirements are set and maintained by the Information / Records Management department and should comply with legal and business requirements as well as industry and government regulations. The policy must cover all emails sent or received by the organization and contain the guidelines for which emails should be kept, how long they are to be retained and where they are to be stored.

One of the most important aspects of email retention is automation: emails should be removed from the system in a consistent manner without manual intervention. Automation eliminates human error and significantly decreases corporate liability. The automation must have controls in place to account for any pending litigation or holds on the information before any emails are deleted. An email archiving solution can help you comply with an automated email retention policy.
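The deletion decision described above reduces to a simple rule: an email is eligible for automated deletion only once its retention period has elapsed and no litigation hold applies. The sketch below is purely illustrative; the function name, the two-year retention period and the representation of legal holds as a set of mailbox identifiers are assumptions, not features of any particular archiving product.

```python
from datetime import datetime, timedelta

# Hypothetical retention period; a real system would read this from the
# organization's retention schedule.
RETENTION_DAYS = 730  # e.g. two years

def eligible_for_deletion(received, mailbox, legal_holds, now=None):
    """True only when the email is past its retention period AND the
    mailbox is not subject to a litigation hold."""
    now = now or datetime.now()
    if mailbox in legal_holds:  # a hold always overrides retention
        return False
    return (now - received) > timedelta(days=RETENTION_DAYS)
```

The key design point is the order of the checks: the hold test comes first, so a legal hold suppresses deletion no matter how old the message is.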

In the United States, email retention requirements became part of email policy in 2006 when the Federal Rules of Civil Procedure (Federal Rules) were revised to include all electronically stored information2, although email retention was already part of many information management programs. Rule 34 of the Federal Rules covers “any designated documents or electronically stored information—including writings, drawings, graphs, charts, photographs, sound recordings, images, and other data or data compilations—stored in any medium from which information can be obtained either directly or, if necessary, after translation by the responding party into a reasonably usable form”. The Federal Rules set out the procedures for civil lawsuits in United States federal courts. They do not form part of Canadian federal, provincial or territorial law but may have persuasive value.

The General Data Protection Regulation (GDPR), which was approved in April 2016 and came into force in May 2018, is a regulation in European Union (EU) law on data protection and privacy for all individuals within the EU and the European Economic Area. The GDPR applies to organizations with an established presence in the EU, organizations that offer goods and services to individuals in the EU, and those that monitor the behavior of individuals in the EU. The GDPR has also put a burden on Canadian organizations that do business and exchange information with the EU; email plays a significant role in that exchange.

The GDPR stipulates that “Personal data can only be held and processed for as long as is necessary for a specific purpose. This necessitates careful consideration of how long archived emails need to be kept, further emphasizing the need for a concrete, thorough email retention policy.”3

 

Sample Email Policy

It is often useful to review other organizations’ email policies to see if there are elements that can be included in your policy. Below is a sample of a public body’s email policy.

Policy

Electronic mail (email) is one of the most extensively used forms of communication between [Public Body] employees and between the [Public Body] and the public. Email messages, like other records created and received by the [Public Body], are official records. The Policy on Electronic Mail Management provides direction on the ownership, organization, storage and protection of email messages stored within the [Public Body]’s email accounts.

Purpose

The purpose of the Policy on Electronic Mail Management is to ensure that email messages of enduring value to the [Public Body] remain accessible to meet legal, business and accountability requirements, and to ensure the regular disposal of email messages having only transitory value.

General

The [Public Body] reserves the right, without the consent of the user, to monitor, examine, copy, store, forward and disclose the contents of email messages, especially in relation to investigations, legal proceedings, professional misconduct and requests under the Freedom of Information and Protection of Privacy Act. Employees will ensure that their email “out of office” is turned on advising whom to contact during prolonged periods of absence. Common rules of etiquette will be followed when using the [Public Body] email system, per the Information Technology Network Policy and Guidelines. Email accounts of employees that are no longer employed at the [Public Body] will be removed from the system six (6) weeks after the employee’s last day of employment. It is the responsibility of the departing employee’s manager to ensure that email of enduring value remains accessible and stored in the Electronic Document and Records Management System (EDRMS) repository.

Ownership

All email accounts are the property of the [Public Body] and are provided to employees to facilitate the conduct of [Public Body] business. Email accounts may be used for personal use on personal time, per the Information Technology Network Policy and Guidelines. Employees do not have any personal or proprietary rights over email messages and attachments contained within [Public Body] email accounts. Upon request from the manager, custodial rights to email messages within an account of a former employee will be transferred to a current employee, usually the immediate supervisor.

Store and Retain

Email messages of enduring value to the [Public Body] will:

• Be retained following the [Public Body]’s retention schedule
• Remain accessible, based on security constraints, and be stored in the EDRMS
• Not be stored in an employee’s inbox or on a personal drive

A date limit of two (2) years will be imposed on all email accounts. Emails older than two years will automatically be deleted.

Email messages sent within the [Public Body] will be saved by the sender. This includes the outgoing message and the responses. Email messages received from external sources will be saved by the recipient. If more than one employee is a recipient, the first employee on the recipient list will save the email message.

Protect and Secure

Access to the [Public Body]’s email system is limited to employees and Board members who have been assigned an email account. Sensitive third-party information received via the [Public Body]’s email system will be protected from unauthorized or accidental disclosure, use or alteration. Scanned personal signatures will not be included within email messages under any circumstances. This is to prevent their use in a fraudulent or inappropriate manner.

Dispose

Email messages that are transitory will be retained only as long as needed to meet legal, business and accountability requirements.
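One operational detail of the sample policy’s Store and Retain section, the rule for deciding who saves a given message, lends itself to a short sketch. The function name, the internal domain and the address-matching logic below are assumptions for illustration only; the policy itself names no domain or mechanism.

```python
# Hypothetical mail domain standing in for the [Public Body].
INTERNAL_DOMAIN = "publicbody.ca"

def responsible_saver(sender, recipients):
    """Apply the sample policy's rule: internal mail is saved by the
    sender; mail from an external source is saved by the first internal
    address on the recipient list."""
    def is_internal(addr):
        return addr.lower().endswith("@" + INTERNAL_DOMAIN)

    if is_internal(sender):
        return sender            # internal mail: sender saves it
    for addr in recipients:      # external mail: first internal recipient
        if is_internal(addr):
            return addr
    return None                  # no internal party involved
```

Making the rule deterministic like this (exactly one accountable person per message) is what prevents both duplicate filing and the “everyone assumed someone else saved it” gap.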

The policy above is a good starting point. It provides clear direction on what the policy pertains to: records created and received by the public body are official records. The purpose is clear: messages of enduring value remain accessible to meet legal, business and accountability requirements, and transitory email messages are disposed of regularly. The policy does, however, lack the information employees need to determine what is an official record and what is transitory information.

The ownership section does not clearly spell out who or what owns the email account and its contents. In the case of discovery, who is responsible for placing a hold on the email account(s): the employee, the manager or some other officer of the organization? Normally this decision would be made as part of the overall legal hold activities and be defined in a Legal Hold policy and process document.

The store and retain section is very clear regarding the retention and disposition rules for managing emails. “A date limit of two (2) years will be imposed on all email accounts. Emails older than two years will automatically be deleted” is a very interesting provision that organizations do not often implement. Without a clear owner, who enforces this policy?

The Government of Canada has a very useful policy framework that ties all related directives, policies and standards into a single view4. This allows employees to get a holistic understanding of information being treated as an asset. Email management is just one of the standards used to manage records and critical information at the Government of Canada.5

 

Training and Employee Buy-in

The first step in any training program is to gain the understanding and support of employees. With email management, training is critical to implementation. Getting employees to accept that email can be a record and belongs to the organization, not to the individual who created or received it, is difficult. Simply providing them with the policy and asking them to read it is not adequate. Training should give employees an understanding of the policy itself, why it is necessary and how to use technology, tools and other resources to comply with it. While orientation or onboarding is a good time to introduce employees to the policy, ongoing training helps ingrain the need for compliance and refreshes employees on all aspects of the policy.

Including some fun activities in the training can increase employee buy-in and may help to address the fact that employees have different learning needs.

Some ideas to include in the training are:

Keep or Toss:

Give employees five to ten emails – a mixture of transitory emails and records. The employees are to sort them into keep or toss piles, within a given time limit. Then discuss the employee decisions.

A, B or C:

Give each employee three cards marked A, B, and C. After reviewing the email policy, ask multiple choice questions about the provisions included in the policy. Have a prize for the employee with the most correct answers. For example:

Who owns emails received as part of your work duties?

A. You – it’s your name on the email.
B. Your organization
C. Your supervisor

Crossword Puzzle or Word Search:

Using terms from the email policy, create a puzzle. Give the puzzle to the employees and have them solve it, within a given time frame. Have a prize for the employee who solves the puzzle first.

Jeopardy:

Using categories relating to the email policy, employee responsibilities and determining which emails are records, players provide questions to the answers displayed. For example:

Under the Category, “Record or Transitory”, displaying an answer of “A meeting appointment 2 weeks out of date” would elicit the correct response of, “What is a transitory document?”

Themes:

Tie in the training with various themes. For example;

April is designated as ARMA’s Records and Information Month. Tie that in with spring cleaning of employee email accounts.

Sudoku:

Use the word “Complying” to create a Sudoku puzzle6 – this idea is borrowed from Pacific Coastal and Marine Science Centre – part of the US Geological Survey.

Are you “COMPLYING” with our Email Policy? Try completing the Sudoku puzzle below. Each large square of 9 smaller squares will contain all the letters in the word “COMPLYING”. All rows and columns will do the same. Can you complete the puzzle so each large square has all the letters in “COMPLYING” without repeating the same letter within a large square, row or column?
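The validity rule described above (every row, column and large square contains each letter of “COMPLYING” exactly once — the word conveniently has nine distinct letters) can also be checked programmatically, for instance by a trainer generating puzzles. This is an illustrative sketch; representing the grid as a list of nine nine-letter strings is an assumption.

```python
LETTERS = set("COMPLYING")  # nine distinct letters

def is_valid_complying_grid(grid):
    """True if every row, column and 3x3 box of a 9x9 letter grid
    contains each letter of 'COMPLYING' exactly once."""
    rows = [set(r) for r in grid]
    cols = [set(col) for col in zip(*grid)]
    boxes = [
        {grid[r + dr][c + dc] for dr in range(3) for dc in range(3)}
        for r in (0, 3, 6) for c in (0, 3, 6)
    ]
    # A group with a repeated letter yields a set smaller than LETTERS.
    return all(group == LETTERS for group in rows + cols + boxes)
```

Because any solved digit Sudoku can be relettered with a bijection from 1–9 to the letters of “COMPLYING”, existing puzzle generators can be reused unchanged.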

 

 

Branding

BusinessDictionary.com defines branding as “The process involved in creating a unique name and image for a product in the consumers’ mind, mainly through advertising campaigns with a consistent theme. Branding aims to establish a significant and differentiated presence in the market that attracts and retains loyal customer.”7

Creating your brand can be another method to engage your employees and create enthusiasm for the policy implementation. Ask employees to submit ideas and award a prize to the winning submission.

Some examples of branding email policies are:

  • The biggest loser
  • Please be a tosser
  • Control your records before they control you
  • To keep or not to keep…that is the question
  • Know when to hold’em, know when to throw’em
  • If you purge it, they will come
  • RM is the only profession that knows in advance what it is going to forget
  • Keep the Corporate memory – forget the clutter
  • Email is a communication tool, not a filing cabinet!

Use the winning brand as a marketing tool to promote the policy. For example, the theme listed above “Know when to hold’em, know when to throw’em” could have an old riverboat gambler or old west slant to it. A riverboat gambler or cowboy could be used as a mascot. Kenny Rogers’ song “The Gambler” could be played when employees are arriving for training. The games suggested in this article could be renamed to incorporate the theme.

Some organizations have even created entertaining videos using their brand to publicize the email policy and to teach employees their responsibilities regarding the email policy. Imagine the fun that could be had using the “Know when to hold’em” theme when creating a video.

While the authors have not seen branding used to implement an email policy, the Sunshine Coast Regional District did use this technique for the implementation of its electronic document and records management software, branded “Dr. Know”. They used a mad scientist as the icon. Training was done in Dr. Know’s laboratory, those who completed training were called “Know-It-Alls”, and those with in-depth training who could act as a resource within their departments were called “Specialists”. Training was also themed: the introductory training was called “Getting to Know Dr. Know” and advanced training was “Getting Your Results from Dr. Know”. Holidays have also been used for training, including Valentine’s Day (Bee My Record), Easter (Dr. Know’s Easter Collections), back to school (Back to School, Back to Basics), and summertime (MOJITO – Managing Our Jobs using Information To Our advantage).

Using branding and marketing tools can be very effective – all it takes is a little imagination.

 

Summary

Policies are rules or guidelines for a specific business process. They are formal organization-wide instructions that must be followed by everyone within the organization. Violation of policies may result in disciplinary action. It is critical to the implementation of new policies to have approval from top management.

Policies are the mechanisms to get your organization to its destination: ensuring that information is treated as an asset and retained or disposed of in accordance with your published rules. There is a learning curve that may not feel natural to employees at first while deploying an email policy. Creating a policy that is clear and concise and explains who, what, where, when and why is the starting point. Adding directives, standards, guidelines and procedures will help to ensure understanding of and compliance with the policy. Conducting training sessions and creating a brand that is fun and engaging will appeal to most employees and generate compliance with the policy. The ultimate goal of policies is to have a workforce that understands and follows the information policies set out by the organization, including email.

It is not enough for organizations to put even the most well-written policies in place. They must also be able to attest to the policy training they provided, and to certify, through employees’ signatures and performance, that employees have been trained and understand the policy and the consequences of violating it. This can be accomplished by signing a declaration of completion or passing a test on email management. The attestation and certification provide proof that employees understand and are to follow the email policy; these records can be used during legal actions or with regulators to show that the organization has done its due diligence.

Policies must be reviewed annually and, if necessary, refreshed or rewritten. If the policy is altered significantly, employees must be informed of the change and how it affects their performance. With proper training and incentives, employees will use the email policy and its supporting documents to manage their email accounts as expected, making for a stronger, more compliant organization.

 

Works Cited

1ARMA International, Implementing Electronic Messaging Policies TR31-2018 (Overland Park: ARMA International, 2018).
2https://www.law.cornell.edu/rules/frcp
3https://www.intradyn.com/email-retention-laws/
4https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=12742
5https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=27600
6US Geological Survey. Pacific Coastal and Marine Science Centre. walrus.wr.usgs.gov/infobank/programs/html/training/recordsmanagement/rmhumor.html
7http://www.businessdictionary.com/definition/branding.html

Thank you to John Bolton!

by Alexandra (Sandie) Bradley, CRM, FAI

 

John Bolton was a founding member of the Editorial Board of ARMA Canada’s Sagesse: Journal of Canadian Records and Information Management, and a contributing author to the publication.

In early 2018, he retired from the Board, and his fellow members of the Board wish to thank him for his contributions and provide the following profile of his career and contributions to the records and information management community.

 

Biography

John was born in Oakland, California in 1950, and came to Canada with his family in 1954. He grew up in the Niagara Peninsula region of Ontario. He attended the University of Waterloo for his first degree, in urban and regional geography, then attended Wilfrid Laurier University to obtain an MA in geography.

Following graduation, he worked for the regional government of Niagara, creating maps for emergency management, and became the environmental lead in emergency response. In the fall of 1978, he moved west to Calgary, where he joined a division of Texas Instruments (TI) and worked in geotechnical seismic exploration. After a couple of years, he became a company troubleshooter, providing support and relief to company crews, travelling all over the world and working in both land and maritime environments.

After six years, he left TI and was in the process of applying to the PhD program at the University of Alberta. However, conversations with a librarian led him to discover that he had been working with two sides of the triangle of information management: “the creation and gathering of information” and “the researching and using of information”. The librarian suggested the missing piece was “the management of information”. To follow the management pathway, he could either go into the Information Technology (IT) field (just developing in the late ’80s) or go to library school. Knowing nothing about librarianship, he conducted his research the old-fashioned way, using books and references, and discovered that the profession could be a match for his interests. His future wife, Debbie, lived in Vancouver, so he applied and was accepted at the University of British Columbia, where he completed an MLIS in 1987. Among the courses he completed was a records management class taught by Sandie Bradley. That class introduced John to the “Life Cycle” concept, which supported his interest in the management of information and ultimately turned his career path toward records and information management.

 

RIM Career

When he graduated from UBC, he worked for Klohn Leonoff Engineering from 1987 to 1993 (and married Debbie!). During that time, he was manager of records and library services, and ultimately became the office manager before he left the firm. He obtained his CRM in 1995.

His RIM career continued when he joined the BC Government in 1994, as a records officer in the Ministry of Skills, Training and Labour. The British Columbia Freedom of Information and Protection of Privacy Act had been introduced in October, 1993, and he was immediately processing FOI requests. When asked by his manager to identify his career goals, John stated that he wanted his manager’s job. Two months later, his manager was gone, and John got his wish, becoming the FOI and Records Coordinator. With one assistant, he managed the flood of requests arising from the new legislation, plus the management of all of the ministry’s records.

A joint project between BC and the Government of Canada to manage the devolution of federal programs to the province led him to a sub-committee developing an information sharing agreement. In 2001, he was head-hunted to join another ministry, Human Resources, where he continued to review privacy issues but also did more training and policy development. By 2004 he was spending so much time on new systems development, including aligning information privacy and security requirements with new computer technology, that he was asked to join the IT department. The following year he joined a data warehousing project. As part of his training he completed a master’s level program in project management at the University of Victoria. From 2006 until his retirement in 2014, his title was Data Warehouse Architect, conducting business intelligence projects for the various ministry programs and working with the management of his ministry’s “big data”.

 

ARMA Involvement

John’s involvement with ARMA International started when he joined the Vancouver Chapter in 1988. He was immediately tagged as Membership Chair, and in 1990-91 he was the treasurer of the first ARMA Canada conference, held in Vancouver. He was also the CLARA representative. While an active member of the chapter, he contributed a regular column, “As I See It”, to the chapter newsletter, offering his perspective on new trends and issues. He won the Chapter Member of the Year Award in 2000.

In 2003, he became a charter member of the Vancouver Island Chapter, when it was established as an independent chapter. His busy work life kept him on the sidelines of chapter activities for a number of years, although he was a speaker to students at UBC, and he presented a paper at the Mountain Pacific Region’s “Whining about Records” conference in Kelowna in 2010. In 2011-12, he was the co-chair for the ARMA Canada Conference hosted by the Vancouver and Vancouver Island chapters in Nanaimo.

Along the way, he got “goosed” in recognition for his service to the profession.

At the International level, he joined the Email Standards Task Force in 1997. From then until 2004 he served as a member of the Standards Development Committee, during the last year as Co-chair. During his time on the committee he was part of the review of both ISO 154891 and the US DoD 5015.02 standards2. Over his ARMA career John presented at numerous conferences and was a published author for ARMA.

 

Sagesse

Through his deep and abiding interest in professional research and scholarship, he was a founding member of the Editorial Board of Sagesse. He contributed not only to the structure and functions of the Board, but also helped to shape the early operations of the peer review process and contributed his views on the quality and quantity of content. As a speaker and writer, he presented his research on the decline of scholarly publishing by RIM professionals and provided a joint workshop with fellow Board member Christine Ardern (entitled “Getting Your Message Across in Writing”) encouraging members to write for Sagesse. This presentation was given at the ARMA Canada conference in Toronto, May 29-31, 2017.

Asked to describe his perspective on the RIM profession now that he is retired, he stated, “Nothing is new, but issues always follow the swinging door of whatever is new.” He continues to see no new research or shifting of paradigms. In 2014 he addressed UBC students in a class where he put forth the idea that the records life cycle is a 500-year-old paradigm, now dead in view of the constant change within information technology. Hopefully there are researchers at work who can put forward the newer paradigm we need.

In recognition for his work in establishing Sagesse, John was awarded the ARMA Canada Region Distinguished Member Award in 2016.

The Sagesse Editorial Board is grateful to John for sharing his perspectives and passion about information, and its members would like to share some additional thoughts:

Sandra Dunkin notes:

John has been an enthusiastic supporter of professionals new to RIM, always available to give astute advice and guidance to those who need it. He has mentored CRM candidates with his unique perspective gleaned from both RIM practice and PMP training – invaluable tuition for anyone hoping to succeed on Part 6 of the CRM exam. He also offers endless encouragement to colleagues to progress not only in their careers, but in expanding their professional development on broader topics that enhance RIM practice. John is one of those colleagues we all hope to encounter during our careers; his kindness and support have been of incalculable value to those who have had the great fortune to know him.

Stuart Rennie shares:

I first met John Bolton years ago at an ARMA Vancouver Chapter event where he was speaking. My first impression of John was that he was very intelligent, engaging and funny. Years later, we worked together as founding editorial board members at Sagesse, Journal of Canadian Records and Information Management. On the Sagesse board, John’s sage counsel was instrumental in getting us off the ground and publishing. We had many thought-provoking discussions about the RIM profession and publishing. For example, John taught me about the concept of the “hamburger” style of writing. Proper writing needs a top bun: a clear, distinct opening; then filler: facts and evidence; and finally a bottom bun: a conclusion that wraps it all up. Tasty AND fun!

Uta Fox adds:

Being a member of Sagesse’s Editorial Committee with John was a delight and an education. While working through articles the committee was considering as publish-worthy, he habitually referred to himself as the “curmudgeon” of the group. John’s exceptional “eye” immediately spotted misplaced commas that changed meaning, inconsistent sentence structures, or an argument that did not address the focus of the article. Along with his mastery of grammar, punctuation and sentence structure, he was an expert at developing logical arguments; whether addressing RIM theory or its practical aspects, he demonstrated the same zest for expression.

I urge readers to re-read John’s article in Sagesse’s first issue in 2016, entitled “A Content Analysis of Information Impact: Professionalism or Not – a Critical Twenty-Five Year Review”, available on the ARMA Canada website: www.armacanada.org. You’ll see how he weaves the concept of RIM as a profession (is it or isn’t it?) with content analysis, the value of a theoretical foundation for RIM, improving Canadian authorship, and addressing RIM from the Canadian perspective. His article is brilliant, its insights viable and valuable, with much food for thought provided (and not just the hamburger type!). Thank you, John, for your participation and assistance in promoting the Sagesse publication. You set the bar high and we accept the challenge.

 

John Bolton’s contributions to the RIM profession have shaped Sagesse in focus and content. Thank you and well done!

 

Works Cited

1ISO 15489-1. Information and Documentation – Records Management – Part 1: General. First Edition, ISO, 2001. ISO/TR 15489-2. Information and Documentation – Records Management – Part 2: Guidelines. ISO, 2001.
2United States Department of Defense. Electronic Records Management Software Applications: Design Criteria Standard. April 25, 2007. http://jitc.fhu.disa.mil/cgi/rma/standards.aspx.

Are We There Yet?

Introduction to the SharePoint & Collabware Implementation at the First Nations Summit

Sandra Dunkin, MLIS, CRM, IGP
Records & Information Management Coordinator First Nations Summit
Sam Hoff
ECM Consultant Gravity Union

 

Introduction[1]

The topic of SharePoint and its use for records management has been of perennial interest in the records and information management (RIM) community since Microsoft first entered the market with SharePoint in 2001. Through the various iterations of the platform, Microsoft has endeavoured to provide records management capabilities out of the box, through customizations, and more recently through third-party add-on applications that manage the compliance and automation of records classification and disposition. Numerous case studies and educational sessions have been offered by Microsoft and related third-party add-on vendors, as well as by subject matter experts like Bruce Miller. Each year, information organizations such as ARMA International, ARMA Canada and the various ARMA chapters, along with many other North American and international information management professional organizations, continue to offer SharePoint sessions as part of their education outreach. SharePoint remains a hot topic in the RIM industry, with increasing market share as third-party developers continue to evolve their applications to manage sophisticated RIM tasks. Many organizations are keen to provide multipurpose solutions for their business needs rather than investing in specialised software for siloed RIM activities, making SharePoint an attractive alternative to expensive, specialised and often modular RIM software applications.

This case study will give an overview of the SharePoint implementation project at the First Nations Summit (FNS), underlining lessons learned, missteps, failures and eventual successes. From the inception of the project, the FNS has endeavoured to create a multi-purpose solution that addresses the evolving needs of staff, executive and contractors in their daily work, as well as the RIM requirements of their large volume of records. The challenges of change management, user training and adoption, various upgrades, and the changing environment as users engage more actively with SharePoint will also be covered.

 

A few words about the First Nations Summit

The First Nations Summit comprises a majority of First Nations and Tribal Councils in British Columbia and provides a forum for First Nations in BC to address issues related to Title and Rights and treaty negotiations, as well as other issues of common concern.

In October 1990, leaders of BC First Nations met with the Prime Minister of Canada and then with the Premier and Cabinet of British Columbia urging the appointment of a tripartite task force to develop a process for modern treaty negotiations in BC. The federal and provincial governments agreed and on December 3rd, 1990, the BC Claims Task Force was established by agreement of the Government of Canada, the Government of British Columbia, and representative leadership of First Nations.

Leaders from First Nations across British Columbia appointed three members to the BC Claims Task Force at a meeting called the First Nations Summit. Two members were appointed by the Government of Canada, and two by the Province of British Columbia. Following more than five and a half months of deliberations, the 1991 Report of the BC Claims Task Force recommended that First Nations, Canada and British Columbia establish a new relationship based on mutual trust, respect and understanding – through political negotiations.

 

FNS Role

The FNS is an action and solutions oriented First Nations-driven organization. The Summit’s original mandate is to advance discussions with the governments of Canada and BC to support First Nations in conducting their own direct treaty negotiations with Canada and BC. The foundation for our mandate arises from:

  • the tripartite 1991 BC Claims Task Force Report jointly developed by the First Nations, Canada and BC,
  • the 1992 agreement to create the BC Treaty Commission as the independent body to “facilitate” treaty negotiations, and
  • subsequent federal and provincial legislation and the First Nations Summit Chiefs resolutions implementing the 1992 agreement and establishing the BC Treaty Commission as a distinct legal entity.

Approximately 150 First Nations in BC participate in FNS assemblies to bring forward, discuss and provide political direction on issues of common concern.

In carrying out their mandate, the FNS does not participate as a negotiating party in any First Nation-specific negotiations. Over time, through the collective decisions of First Nations Chiefs and leaders as directed by resolutions, they have been instructed to take a leadership and advocacy role on the full range of issues of concern to First Nations, including negotiation and implementation issues of treaties, agreements and other constructive arrangements, and the day-to-day social and economic issues which affect First Nations.

A critical element of the Summit's work is the identification of concrete steps to overcome negotiation barriers. In First Nations‐Crown treaty negotiations in BC, the parties face a number of process and substantive issues that pose significant challenges and must be overcome in order to reach treaties, agreements and other constructive arrangements.

Although they remain committed to the made‐in‐BC approach to negotiations and to assisting First Nations in achieving full and comprehensive treaties as a primary objective, they fully respect and support decisions of any First Nation to enter into alternative agreements and other constructive arrangements to advance the interests and priorities of their respective nations.

FNS People

The executive of the First Nations Summit consists of a three (3) member political executive (FNS Task Group) and a two (2) member administrative executive (FNS Co-Chairs). Each member of the executive serves a three-year term upon election. The FNS currently has a permanent staff of ten who implement the mandate, as determined by resolutions passed by chiefs in assembly at First Nations Summit Meetings, which are held three times each year. They also partner with numerous other First Nation organizations, such as the BC Assembly of First Nations and the Union of BC Indian Chiefs to collectively work on issues of concern to BC First Nations.

FNS Information Management

As treaty negotiations are an ongoing activity, the bulk of the FNS's records remain active throughout the process, from the inception of the First Nations Summit in 1991 to the present. At the current rate of accumulation, the FNS will soon exceed their storage capacity. They have a large volume of paper records and are facing significant space limitations, not to mention concerns regarding the structural integrity of the building and the increasing weight of their collections (they are on the 12th floor of an aging office tower).

The FNS also has a broad variety of collections – numerous types and formats of records, including typical paper records, meeting binders, verbatim transcripts, bound publications, as well as video tape and other formats. All of this content is historically and continuously important, informing current activities. However, until recently, this important collection of material was entirely unavailable offsite.

 

History of the Project

On December 5, 2007, the ARMA Vancouver Chapter hosted an educational event at the City of Surrey, BC, demonstrating the City’s pilot SharePoint implementation. This demonstration showcased the features and capabilities of the SharePoint Records Centre as well as the collaboration features inherent in the platform. At the same time, the FNS began to realize the accumulation of their records was accelerating beyond the capacity of existing storage units (various file, binder, video and library cabinets). They were looking for alternative solutions for the maintenance of their collections that would address the growth of physical records collections. The FNS recognized they would need to rely on digital records collections to address this growth.

After much research, consideration and investigation, the FNS subsequently purchased and began implementation of SharePoint 2007 in August 2009. They later upgraded to the new release of SharePoint 2010 the following year.

Initially the FNS attempted to install and operate SharePoint without any external support; however, they promptly realized that, due to the complex nature of the software, they would need additional, specialized assistance to succeed with their implementation. The FNS therefore hired a consulting firm to set up SharePoint 2010 with native Records Centre capabilities and some customisations to meet their unique needs. Unfortunately, the firm did not have adequate knowledge of the SharePoint Records Centre or of RIM compliance requirements. Ultimately, this project failed, as the configuration and capabilities of SharePoint out of the box did not provide satisfactory RIM compliance, structures and protocols.

At the 2013 ARMA Canada Conference in Saskatoon, the FNS Records Manager participated in the two-day SharePoint pre-conference session delivered by Bruce Miller. During this session, Mr. Miller strongly recommended a small number of third-party add-ons for RIM in SharePoint to meet RIM compliance requirements. Following his session, the Records Manager sought out and watched demos of many third-party RIM add-ons while at the conference, and then met with representatives of these vendors after the conference.

The FNS eventually chose to purchase Collabware CLM. This decision was based on several factors, most importantly the in-place records management feature as the best solution for staff and existing workflows. They relaunched the project with the new SharePoint 2013 platform and Collabware in August 2013.

The FNS have taken a unique approach to SharePoint by ensuring that Records Management compliance through Collabware was fully configured, and that the SharePoint interface was functional and relatively intuitive for users, before any content was uploaded. They also decided to invest significant time and effort in populating the SharePoint site before going live. They have, therefore, spent much of the last few years converting legacy records in various formats and uploading this content into SharePoint to enhance the user experience at launch. It was also determined that, in order to facilitate the change management process, they would need to provide sufficient materials to demonstrate the utility and functionality of SharePoint. They used this process to gain additional experience and proficiency with uploading records to SharePoint, so as to be better able to assist when users have questions or concerns. This process has allowed the development team to work through 'glitches' and anomalies as they encountered them, saving users the frustration of experiencing them unaided. As of the date of this article, the FNS has over 12,250 documents and several thousand hours of labour invested in the implementation.

A quick note about the network infrastructure

Back in 2009 the FNS initiated the SharePoint 2007 implementation using dedicated hardware – three physical servers to host SharePoint server, SQL server, and a sandbox for testing. All servers were hosted on site with internal static IPs.

By 2010, they had already begun the process of virtualizing the other server applications to facilitate maintenance, failover, and recovery processes – when upgrading to SharePoint 2010 all of the associated SharePoint server applications were subsequently virtualized.

As of June 2018, the FNS launched a new project to upgrade to SharePoint 2016[2], Collabware CLM v3.2, SQL Server 2017 and Office Online Server. This solution was built in a test environment parallel to the existing SharePoint 2013 implementation. Once the functionality of the new SharePoint 2016 platform had been confirmed, SharePoint 2013 was shut down and the related Domain Name System (DNS) records were updated to go live on the 2016 platform. The SharePoint 2016 implementation is now running within a virtualized environment with a total of four SharePoint-related servers.

It should be noted that the FNS continue to maintain all of their servers on premises due to proprietary, confidentiality, privacy and security concerns, and a lack of sufficiently reliable and trustworthy cloud services within Canada, and within British Columbia specifically. There remains a lack of transparency with respect to the exact location of user data on cloud server farms, and there are insufficient Canadian-owned and -operated cloud services available. Most cloud options in Canada are owned by extraterritorial corporations, making the data potentially subject to foreign government search and seizure, specifically under the provisions of the U.S. Patriot Act. At this time the cloud remains far too ambiguous with respect to ownership, jurisdiction and protection of data for the Summit to feel confident placing their intellectual property in the cloud.[3]

 

FNS Goals

The FNS had several goals they hoped to achieve in adopting SharePoint in addition to Records Management:

  • Resolution of the shared drive chaos;
  • Elimination of redundant versions and copies of documents;
  • Reduction of email clutter;
  • Improved collaboration.

Shared drive issues

Like many organizations, the FNS had seen its shared drives grow out of control over the years. Complex, subjective folder structures represent individual interpretations of the material and are entirely indecipherable to coworkers, and often, over time, to the staff members themselves. While attempts had been made to impose order on the shared drive, all proved futile as staff resisted change and continued to work within their existing folder structures.

SharePoint Search has alleviated a great many challenges with respect to the shared drive – most significantly, by indexing the shared drive as an external repository, it reveals the location of relevant records and documents within the complex folder structures.

Redundant copies

The proliferation of multiple redundant copies and numerous versions of individual documents has also proved to be a significant impediment to efficient workflow. These various versions and copies exist across numerous folders for various staff members, often appearing in more than one folder within a single staff member’s folder series. Rarely are the drafts, disparate versions or “old” copies deleted from staff folders, further complicating the identification of the authoritative document or record. These versions and copies are difficult to browse and almost impossible to search with common desktop search tools.

SharePoint allows staff to maintain a version history for each document. At any time, a document can be compared with and reverted to a previous version. This means that all staff can be working on and editing a single document without needing to save and merge multiple versions. Also, SharePoint enables multiple users to be actively editing a single document at once, again removing the need for more than one copy of the document.

 

Figure 1: A document being edited in SharePoint with Word. Note the version history on the right-hand side

 

Email

Email has long been the preferred method of collaboration amongst staff, executive and consultants. This excessive use of email to collaborate on documents and projects has caused significant clutter in staff and executive inboxes and substantial difficulty in merging the various changes into a single document, further compounding the confusion of versions and copies noted above.

While they have not as yet eliminated this process of collaboration, the FNS is working on new collaboration sites within SharePoint to address specific projects and staff objectives. They are hoping that staff will gradually migrate away from email to these ‘team sites’ to eliminate email clutter and excessive growth of email inboxes and secondary email folders.

Collaboration

Collaboration was another consideration – as described above, collaboration has been achieved through long email strings, resulting in disparate versions of a single document needing to be amalgamated. The FNS wanted to consolidate and streamline the versioning process to make collaboration more efficient. As a corollary to collaboration, they also sought to provide better access to additional content for stakeholders without geographic restrictions, more specifically, access to the historic treaty collections. The FNS are also hoping that increased engagement with SharePoint team sites will reduce the volume of disparate versions, with such evolving titles as "document-draft1", "document-draft2", "document-final", and so on.

The FNS decided that their implementation would be engineered to have governance structures and protocols applied to the content up-front to limit, if not prevent, the proliferation of “free-range” content as with the shared drive. Rather than allowing end users to create content areas however they like, the core Records Management team is responsible for adding new content areas with user consultation, allowing them to maintain control and consistency.

The FNS also face the issue of older, stale-dated content: as software has been discontinued and the pace of software versioning has accelerated, older content is no longer accessible – for instance, they have a large volume of legacy WordPerfect documents but no longer use WordPerfect. They are currently engaged in a document conversion process to maintain these records and make them accessible within SharePoint. The bulk of these documents are being converted to PDF format, uploaded into SharePoint and then deleted from the shared drive, in a process intended to eventually eliminate the shared drive altogether.

 

Classification in the digital environment

The FNS began with their alphanumeric paper records classification of over 5,000 categories, managed in a simple DBTextworks database for paper filing. This classification is very granular to support efficient records retrieval. Obviously, they had to rethink how the classification scheme would transition into the digital environment.

FNS has a large set of metadata terms enforcing standardized descriptions and organization, as well as a 'tags' or 'subject' folksonomy that users can contribute to in order to further describe the records. By leveraging the power of metadata, they were able to reduce the total categories to an initial 96 'big buckets' within which to organize their records. They have subsequently expanded these categories somewhat to allow for variations in retention and information capture.

Some of the key differences between a paper records-based file plan and an electronic records file plan are the use of record categories as metadata, physical limits on how many records can fit into an individual location, and security.

Metadata can be used to classify content, regardless of location

Compared to a paper filing system, where documents of a particular classification are typically filed together in the same physical location, in an electronic content management system, the location of content within the system does not need to be tied to its record category. Metadata on the documents can be used to correctly classify content no matter where it “lives”.

Classification does not need to be used as metadata

Since we are freed from tying the location of a document to its classification, there is also no need to break out subcategories to group documents into manageable chunks. Many documents can reside in one single record category, with metadata used to split them out however desired. For example, rather than having a set of "by year" record categories, which essentially use the record category to add metadata to the document, there can be a single category with metadata in the system separating the documents by year.
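The "by year" example can be sketched in code. This is a minimal illustration under assumed field names, not SharePoint's actual API: hypothetical documents in a single record category each carry a year metadata field and are grouped on demand instead of being filed into per-year categories.

```python
# Hypothetical documents in a single "Meeting Minutes" record category;
# each carries a "year" metadata field instead of living in a per-year category.
documents = [
    {"title": "Q1 Minutes 2016", "category": "Meeting Minutes", "year": 2016},
    {"title": "Q2 Minutes 2016", "category": "Meeting Minutes", "year": 2016},
    {"title": "Q1 Minutes 2017", "category": "Meeting Minutes", "year": 2017},
]

def by_year(docs, year):
    """Filter one record category by its year metadata, on demand."""
    return [d for d in docs if d["year"] == year]

# Both 2016 documents are found without needing a "2016" record category.
print(len(by_year(documents, 2016)))  # 2
```

The grouping lives in a query rather than in the file plan, so the retention rule attached to the single category applies uniformly while users still see year-by-year views.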

Many documents can reside in a single location

They are also much freer in terms of limits on volume. In the physical world, it may be painful to have thousands of records in a single category, as it becomes increasingly difficult to find any specific content within that classification. However, since location and metadata are separated from classification and retention in the electronic world, this is not a problem. There can be 200,000 records in a single classification, spread out across multiple libraries in the system. Users are still able to find what they are looking for using metadata and search, much as most people now use iTunes or Spotify to easily locate a single song in a database of millions.

Classification has been mostly decoupled from security, except for semi active content

While the above points free us in our use of record categories to simplify the file plan, security is a constraint that remains in the electronic records management world. In addition to SharePoint's built-in security model, which allows us to define who can view and/or edit documents, we need to allow for a new case: semi-active records. When documents reach a point in their lifecycle where they are no longer relevant to users, they are sent to a repository so that they are retained as needed but are not cluttering up the document libraries that people use to locate relevant documents. Just as physical content in certain classifications may be retained together in locked cabinets or rooms, security is enforced by classification for semi-active content. The result is that all content in a given classification will have the new security applied to it once it is moved to a repository, regardless of what the security of the individual items was during their active stage.

 

Retention in the digital environment

The FNS also leveraged Collabware's automation capabilities for their retention policies. Using the visual workflow editor, they have created a set of workflows that more precisely reflects the lifecycle of their records, and they have applied these workflows throughout the collection using Content Rules attached to custom content types. Content Rules are a feature in Collabware CLM that allows the system to identify, classify and declare records automatically. These rules identify the correct records through unique combinations of content types applied to libraries and metadata entered by users or provided by the documents.

An example of a content rule may be: “All Documents that are of Content Type Meeting Document, that are PDFs and have Final selected by a user as their status.” In this example, the combination of the type of content as well as the metadata provided by a user, can be used to automatically classify this document as a Record in the correct Record Classification without any direct action by the end user.

They may not even be aware it has occurred, as they were simply following their business process for adding documents to the system.
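A content rule like the one above can be thought of as a predicate over a document's content type and metadata. The sketch below is illustrative only; the field names and the `classify` helper are assumptions for this example, not Collabware CLM's actual rule engine or API.

```python
def matches_meeting_record_rule(doc):
    # Illustrative rule mirroring the example above: content type, file
    # format, and a user-supplied "Final" status together trigger declaration.
    return (
        doc.get("content_type") == "Meeting Document"
        and doc.get("filename", "").lower().endswith(".pdf")
        and doc.get("status") == "Final"
    )

def classify(doc):
    # Declare the document a record in the matching category, with no
    # direct action by the end user.
    if matches_meeting_record_rule(doc):
        doc["record_category"] = "Meetings"
        doc["is_record"] = True
    return doc

final_minutes = {"content_type": "Meeting Document",
                 "filename": "2016-03-01_Minutes.pdf", "status": "Final"}
print(classify(final_minutes)["is_record"])  # True
```

A draft Word document with the same content type would simply fail the predicate and remain unclassified until its metadata changes, which is what makes the declaration invisible to the user.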

Once classified, content follows the lifecycle management workflow associated with each record category. Many actions and flows are possible: for example, a document can be locked down, have different security applied, be sent to a repository, go through a review process, and have a final disposition action.
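The lifecycle actions listed above can be pictured as an ordered set of stages, each with an associated action. This is a deliberately simplified linear sketch; the stage names and actions are illustrative placeholders, and Collabware's visual workflows support richer branching, approvals and timed delays.

```python
# Illustrative stages and actions for a record lifecycle; not Collabware's API.
WORKFLOW = [
    ("declared_record", "lock the document against further edits"),
    ("semi_active", "apply repository security and move to the repository"),
    ("review", "route to the records manager for disposition review"),
    ("disposition", "destroy or transfer per the retention schedule"),
]

def run_lifecycle(doc_title):
    """Walk a document through each stage, recording the action taken."""
    return [f"{doc_title}: {stage} -> {action}" for stage, action in WORKFLOW]

for step in run_lifecycle("2016 Q1 Minutes"):
    print(step)
```

The value of attaching such a flow to a record category rather than to individual documents is that every item classified into the category inherits the same lifecycle automatically.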

 

Figure 2: An example of a Collabware CLM Lifecycle Management Workflow[4]

 

Navigation within the Site

The structure and layout of the homepage was largely developed through user consultation – it was decided that it would be best to follow the highest-level classification structure, moving from broad categories as the launching point to the more granular organization of the records. Each category is shown on the home page with an icon so that it can be easily identified and selected by users. The most commonly accessed areas, grouped under the heading "Business of the Summit", are also listed in a menu at the top of the screen which reveals sub-areas in a list when the cursor is pointed at them.

 

 Figure 3: FNS SharePoint Site Home Page

 

Within each category are subcategories and libraries of documents. The different categories are listed across the top of the page, while libraries are listed along the left-hand side.

 

 Figure 4: An Example of one subcategory area

 

Once in a specific document library, users can search, sort, filter, and group documents by their metadata in order to more easily find the documents they are looking for.

 


Figure 5: An example of a document library with metadata shown in several columns[5]

 

SharePoint Challenges

During the project implementation the FNS encountered a number of challenges with SharePoint:

  • Vocabulary and concept issues;
  • Lack of Microsoft product support documentation and resources;
  • Naming conventions.

Vocabulary and Concepts

Misunderstandings were frequent and often unavoidable in discussions between the Records Manager and the first team of consultants due to the overlapping vocabulary of SharePoint and established records management terminology. Concepts such as "policy" and "archives", both of which have specific meanings in SharePoint and in established RIM vocabulary, became pain points for moving the project forward. The importance of working with a consulting firm that has a well-developed understanding of records management as well as of SharePoint's Records Centre cannot be overstated, as this knowledge will save countless hours of frustration arising from misunderstood terms.

Likewise, the SharePoint "content type", an entity employed to apply metadata columns, workflows, settings and other forms of data capture to a piece of content, was a particularly difficult concept to grasp initially. The content type's function and relationships within the SharePoint environment were not immediately transparent, nor clearly explained. As the project has continued, the concept of a 'content type' has gradually become clearer as the layers of function were revealed, more types of content were uploaded, and more content types were created to describe each of these new types of content.

Support Documentation and Resources

Additionally, and frustratingly, Microsoft does not produce software user manuals to assist with the learning process, and few tools or online resources are available for the lay user. A variety of third-party manuals are available on the market; however, many target expert admin users, are expensive and, given the rapid rate of change, quickly become obsolete. Increasingly, these third-party manuals reference the cloud-based Office 365 SharePoint Online offering rather than the on-premises server implementation that the FNS has. The FNS eventually overcame the lack of supporting documentation through repetition of activities and tasks, as well as significant input from the new team of consultants. They have subsequently created both a User Guide specific to their SharePoint site and a Governance Manual to help users and administrators understand, maintain and adapt the implementation to keep up with changing user needs.

Naming Conventions

As for naming conventions, SharePoint utilizes two distinct fields for the description of documents: name and title. The name field directly contributes to the 256-character limit for the SharePoint URL, as the name forms part of the URL used to retrieve the record. This is significant, as they learned the hard way when their content became unavailable once the URL exceeded the 256-character limit imposed by Microsoft. The implications of the name field were initially unknown to them as they began uploading content with long, descriptive names as employed by staff on the shared drive. After several panic attacks and a rather emotional meltdown when their efforts were discovered to have been obliterated by these descriptive names, they finally had to accept that they had ventured forth using the wrong field, leading to substantial intervention from the consultants and many hours of data revision.

The FNS have subsequently adopted a file naming best practice based on one published by Stanford University Libraries, incorporating international date formatting for clarity. They now use the title field for the more descriptive name, indicating the substance of the record to the user.
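The resulting convention can be sketched as follows. The site path and label below are hypothetical, and the 256-character figure is the URL limit the FNS encountered; the point is simply that short, date-prefixed names keep URLs safely under budget while the descriptive text lives in the title field.

```python
from datetime import date

URL_LIMIT = 256  # the SharePoint URL length limit the FNS ran into

def make_name(doc_date, short_label, ext="pdf"):
    """Short, ISO-dated file name; descriptive text belongs in the
    SharePoint title field rather than the name field."""
    return f"{doc_date.isoformat()}_{short_label}.{ext}"

def url_ok(site_path, name):
    """Check the full URL stays within the limit before uploading."""
    return len(f"{site_path}/{name}") <= URL_LIMIT

name = make_name(date(2018, 5, 28), "FNS-Minutes")
print(name)  # 2018-05-28_FNS-Minutes.pdf
print(url_ok("https://example.org/sites/fns/Shared Documents", name))  # True
```

A pre-upload check like `url_ok` is the kind of guard that would have caught the over-long descriptive names before content became unavailable.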

 

 Figure 6: An example of documents that have short filenames but descriptive titles

 

SharePoint victories

The FNS finally launched SharePoint to their end users in early 2018[6] and are currently engaged in ongoing group training sessions and individual support activities with staff, executive and consultants. Through a series of lunch-and-learn sessions they have acquainted users with the structural elements of the implementation in order to familiarize staff with the platform. The FNS are now holding smaller group sessions to focus training on different interactions with the content, such as site navigation, working within libraries using column filtering and sorting, local search, and paging forward to view results.

Search

SharePoint search has been the single biggest victory for staff buy-in. SharePoint search allows much more effective searching of content than was previously available through the existing desktop search tools, saving much time and frustration among staff.

SharePoint search also improves the experience for content that has not yet been migrated to SharePoint, or that never will be, as they have configured SharePoint search to index the shared drives, allowing users to search content across all of their digital collections more efficiently.

SharePoint search improves upon standard Windows search by searching not only the document title and metadata but also the content of the documents themselves, in the fashion of keyword searching. There is one caveat, however: the shared drive search results can only be viewed and accessed by users within the office network.

 

Figure 7: An example search, showing that content from both SharePoint and the file shares is returned, and that the content within the documents is searched

 

Offsite Access

FNS wants their users to be able to interact with all their resources no matter where they may be – the executive are rarely in the office, and staff are often involved in offsite meetings with their constituents or with partner organizations.

Therefore, they created a targeted remote access workstation for external access. Offsite users open a remote session with the designated workstation on the office network to utilize the SharePoint search when they are not physically in the office. Content on the shared drive can then be identified and uploaded to SharePoint during the remote session to make it available to all users through a generic browser session.

FNS legal counsel have been particularly active in using these remote sessions to search for and acquire relevant records on the fly without having to engage FNS staff. They are also in the process of learning how to upload content they have created on behalf of the FNS.

Research Tools

The FNS have also recently utilized Collabware's Information Query tool and the custom 'FNS Topic' folksonomy to present research results in a single list, even when the records exist in disparate locations throughout the SharePoint libraries. This query tool has been enormously useful in responding to queries from legal counsel and in supporting research efforts.

 


Figure 8: An example of an Information Query used to produce a single list of content related to a specific topic, regardless of where the documents are actually stored

 

Where the FNS is going from here

The FNS is currently developing the collaborative functionality of SharePoint. New staff and legal counsel have requested team sites to facilitate collaboration on various activities, including sites for working groups, committees and research projects. They expect SharePoint to continue to evolve to address new concerns, needs and capabilities as Microsoft continues to evolve the software. The FNS is aware of the forthcoming SharePoint 2019 release and will consider upgrading once it is available – though not before the bugs and hiccups have been resolved.

 

Endnotes

1. This paper was originally delivered by the authors as a presentation at the ARMA Canada Conference in May 2018, in Vancouver, BC.
2. All of the purchases of SharePoint Server and associated user CALs (client access licenses) were facilitated through TechSoupCanada.ca, which offers deeply discounted pricing for non-profit and charitable organizations.
3. The Office of the Privacy Commissioner of Canada has some useful links on cloud computing and privacy; however, the most recent of these articles dates to June 2012, far too old to be considered current given the rapid rate of technological change. The issues of ownership and jurisdiction of data stored in the cloud are not addressed.
4. Note that a one-year delay has been implemented for declaring a document as a record, to allow for additional modification of the document and correction of errors.
5. The FNS has implemented a "True Document Date" column to record the actual date of the document as written on the document itself, as distinct from the machine date applied when the document is uploaded. This is particularly important for identifying historical documents, which are often uploaded years after their creation or receipt.
6. The document conversion process has been substantially time-consuming, leading to the significant time lapse between implementation and launch.

Are We There Yet?
An Introduction to the SharePoint & Collabware Implementation at the First Nations Summit

 

Sandra Dunkin, MLIS, CRM, IGP
Records & Information Management Coordinator
First Nations Summit

 

Sam Hoff
ECM Consultant, Gravity Union

 

Introduction1

The purpose of SharePoint, and its use for records management, has been of perennial interest to the records and information management (RIM) community ever since Microsoft entered this market with SharePoint in 2001. Across successive versions of the SharePoint platform, Microsoft has provided records management capabilities not only out of the box or through particular configurations, but more recently through third-party add-on applications that govern compliance and automate the classification and disposition of records. Numerous case studies and information sessions have been offered by Microsoft, by associated add-on vendors, and by subject matter experts such as Bruce Miller. Every year, information management organizations such as ARMA International, ARMA Canada and the various ARMA chapters, along with several international professional information management organizations, continue to offer SharePoint sessions as part of their educational programs. SharePoint remains a hot topic in the RIM industry, occupying a growing share of the market as third-party developers keep improving their applications to handle complex RIM tasks. Many organizations want multi-purpose solutions for their business needs, rather than investing in specialized software for siloed RIM activities, making SharePoint an attractive alternative to expensive, specialized and often modular software applications.

The case study described below provides an overview of the SharePoint implementation project at the First Nations Summit (FNS), highlighting lessons learned, mistakes, failures and eventual successes. From the beginning of the project, the FNS has strived to create a versatile solution that meets the evolving day-to-day needs of staff, the executive and consultants, while also satisfying the RIM requirements of a large volume of records. The challenges of change management, user training and buy-in, various upgrades, and the changing environment created by users engaging more actively with SharePoint are also addressed.

 

A few words about the First Nations Summit

The First Nations Summit, composed chiefly of First Nations and Band Councils in British Columbia, provides a forum for B.C. First Nations to address issues related to Title and Rights, treaty negotiations and other matters of common concern.

In October 1990, First Nations leaders in B.C. met with the Prime Minister of Canada, and then with the Premier and Cabinet of British Columbia, to urge the appointment of a tripartite committee mandated to develop a modern treaty negotiation process in B.C. The federal and provincial governments agreed, and on December 3, 1990, the B.C. Claims Task Force was established by agreement between the Government of Canada, the Government of British Columbia and representative First Nations leaders.

First Nations leaders from across British Columbia appointed three members to the Claims Task Force at a meeting known as the First Nations Summit. Two members were appointed by the Government of Canada and two by the Government of British Columbia. After more than five and a half months of deliberations, the 1991 Report of the B.C. Claims Task Force recommended that First Nations, Canada and British Columbia establish a new relationship based on mutual trust, respect and understanding – through political negotiations.

 

Role of the FNS

The FNS is a First Nations-led, action- and solutions-oriented organization. The FNS's initial mandate was to advance discussions with the governments of Canada and B.C. in support of First Nations conducting their own negotiations directly with those governments. The mandate stems from:

  • The 1991 tripartite Report of the B.C. Claims Task Force, developed jointly by First Nations, Canada and B.C.;
  • The 1992 agreement to create the B.C. Treaty Commission as an independent body to "facilitate" treaty negotiations; and
  • The subsequent federal and provincial legislation, and resolutions of First Nations leaders, implementing the 1992 agreement and establishing the B.C. Treaty Commission as a legal entity.

Approximately 150 B.C. First Nations participate in FNS assemblies to advance, discuss and provide political direction on issues of common concern.

In carrying out its mandate, the FNS does not participate as a negotiating party in any specific First Nation negotiation. Over time, and through the collective decisions of First Nations chiefs and leaders as expressed in resolutions, the FNS has been directed to play only a leadership and advocacy role on the broad range of issues of concern to First Nations, including negotiations and implementation issues related to treaties, agreements and other constructive arrangements, as well as the everyday social and economic issues affecting First Nations.

A critical element of the FNS's work is identifying concrete measures to overcome obstacles to negotiations. In negotiations between First Nations and the Crown in B.C., the parties face certain process and substantive issues that represent significant challenges and must be overcome in order to reach treaties, agreements and other constructive arrangements.

While it remains committed to a "made-in-B.C." approach to negotiations, and to assisting First Nations in achieving full treaties as its primary objective, the FNS respects and supports the decision of any First Nation to pursue any other agreement or constructive arrangement to advance the interests and priorities of its nation.

The FNS team

The First Nations Summit executive consists of a three (3) member political executive (the FNS Task Group) and a two (2) member administrative executive (the FNS co-chairs). Each member of the executive serves a three-year term upon election. The FNS currently has a permanent staff of 10 who carry out the mandate set by the resolutions adopted by the chiefs in assembly at First Nations Summit meetings, which are held three times a year. It also partners with many other First Nations organizations, such as the B.C. Assembly of First Nations and the Union of B.C. Indian Chiefs, to work collectively on issues of concern to B.C. First Nations.

Information management at the FNS

Because treaty negotiation is ongoing work, the FNS's entire body of records has remained active throughout the process, from the creation of the First Nations Summit in 1991 to the present day. At the current rate of accumulation, the FNS will soon exceed its storage capacity. They have a substantial volume of paper records and face severe space constraints, not to mention concerns about the structural integrity of the building and the growing weight of their collections (they are on the 12th floor of an aging office tower).

The FNS also has a wide range of collections – records in varied types and formats, including typical paper files, meeting binders, text transcripts, bound publications, and recordings on video and in other formats. All of this content is historically significant and still current, shaping ongoing activities. Despite this, until quite recently this important collection of documents was accessible only on site.

 

Project history

On December 5, 2007, the Vancouver chapter of ARMA hosted an educational event in Surrey, B.C., featuring a demonstration of the City's pilot SharePoint implementation. The demonstration showcased the features and capabilities of the SharePoint Records Centre as well as the platform's built-in collaboration functions. At the same time, the FNS was beginning to realize that its accumulation of records was accelerating beyond the capacity of its existing storage units (files, binders, videos and collection cabinets). They were looking for an alternative collection-management solution that would account for the ongoing growth of their physical records. The FNS recognized that it needed electronic document collections to keep this growth under control.

After much research, consideration and inquiry, the FNS subsequently acquired SharePoint 2007 and began its implementation in August 2009. They then upgraded to SharePoint 2010 upon its release the following year.

Initially, the FNS attempted to install and operate SharePoint without outside support, but quickly realized that, given the complexity of the software, they would need additional specialized support to implement it successfully. The FNS therefore engaged a consulting firm to configure SharePoint 2010, develop the on-premises Records Centre capabilities and configure some customizations to meet particular requirements. Unfortunately, the firm lacked the necessary knowledge of the SharePoint Records Centre and of RIM compliance requirements. Ultimately, the project failed because SharePoint's out-of-the-box configuration and features did not provide satisfactory RIM compliance, structures or protocols.

At the ARMA Canada conference in Saskatoon in 2013, the FNS Records Manager attended the two-day SharePoint pre-conference session presented by Bruce Miller. During that session, Mr. Miller strongly recommended several third-party RIM add-ons for SharePoint designed to meet RIM compliance requirements. Following the session, the Records Manager attended demonstrations by several third-party RIM add-on vendors during the conference and met with their representatives afterwards.

The FNS eventually decided to acquire Collabware CLM. The decision was based on several factors, the most important being that its document management functionality, already in place, was the best fit for staff and their workflow. They relaunched the project on the new SharePoint 2010 and Collabware platform in August 2013.

The FNS took an unusual approach to SharePoint, ensuring that records management compliance through Collabware was fully configured before any content was uploaded, and that the SharePoint interface was functional and reasonably intuitive for users. They also decided to invest significant time and effort in populating the SharePoint site before its launch date. They have therefore spent much of the past several years on an exhaustive conversion of legacy documents in various formats and on uploading that content into SharePoint, in order to improve the user experience at launch. It was also determined that, to ease the change management process, they should provide enough material to demonstrate SharePoint's usefulness and proper functioning. In addition, they used this process to gain experience and efficiency in uploading documents to SharePoint, so as to be better equipped to help users with questions or problems. The process thus allowed the development team to resolve glitches and anomalies as they arose, sparing users the frustration of confronting them unassisted. To date, the FNS has more than 12,250 documents, and several thousand hours of work, invested in the implementation.

A brief note on the network infrastructure

In 2009, the FNS began its SharePoint 2007 implementation with dedicated hardware:

  • three physical servers hosting the SharePoint server, the SQL server, and a test environment. All servers were located on premises, with static IP addresses.

By 2010, they had already begun virtualizing their other server applications to simplify maintenance, failover and recovery processes – so following the upgrade to SharePoint 2010, all of the associated SharePoint server applications were subsequently virtualized.

In June 2018, the FNS launched a new project to upgrade to SharePoint 2016,2 Collabware CLM v3.2, SQL Server 2017 and Office Online Server. The solution was developed in a test environment running parallel to the existing SharePoint 2013 implementation. Once the new SharePoint 2016 platform was confirmed to be working properly, SharePoint 2013 was shut down and the associated Domain Name System (DNS) records were updated to point to the 2016 platform. The SharePoint 2016 implementation now runs in a virtual environment with a total of four associated SharePoint servers.

It should be noted that the FNS keeps all of its servers on premises, due to concerns about proprietary information, confidentiality and security, and the lack of reliable, trustworthy cloud services in Canada, and in British Columbia specifically. There remains a lack of transparency about exactly where user data resides within web server farms, and there are not enough cloud services owned and operated in Canada. Most cloud options in Canada are owned by corporations outside the country, leaving the data potentially subject to search and seizure by foreign governments, notably under the provisions of the U.S. Patriot Act. For now, the cloud remains far too ambiguous with respect to data ownership, jurisdiction and protection for the FNS to entrust it with its intellectual property.3

 

FNS objectives

The FNS has several objectives it hopes to achieve by adopting SharePoint alongside Records Management:

  • Resolving the chaos of the shared drives;
  • Eliminating redundant versions and copies of documents;
  • Reducing email inbox clutter;
  • Improved collaboration.
Shared drive issues

Like many organizations, the FNS had lost control of its shared drives over the years. Some folder structures are subjective and complex, reflecting individual interpretations of the content, and are completely indecipherable to colleagues – and often, over time, to the employees themselves. While attempts were made to impose some order on the shared drive, every effort was in vain, as staff resisted change and continued to work within their existing folder structures. SharePoint's search function has alleviated many of the shared-drive challenges – specifically, by indexing the shared drive as an external repository and revealing the location of relevant documents within the complex folder structures.

Redundant copies

The proliferation of redundant copies and numerous versions of individual documents also proved a significant obstacle to efficient operations. These various versions and copies exist across multiple folders and for many employees, often appearing in more than one folder within a single employee's file set. Drafts, disparate versions and older versions are rarely deleted from employees' folders, further complicating identification of the official file or document. These versions and copies are difficult to navigate and nearly impossible to search with common search tools.

SharePoint lets staff preserve a version history for each document. At any time, a document can be compared with, or restored to, a previous version. This means all staff can work on and modify a single document without needing to save and merge multiple versions. In addition, SharePoint allows several users to edit the same document simultaneously, eliminating the need for more than one copy of the document.

 

Image 1: This document is being edited in SharePoint with Word. Note the Version History on the right.

 

Email

Email has long been the preferred method of collaboration among staff, the executive and consultants. This heavy reliance on email for working together on documents and projects causes significant clutter in staff and executive inboxes, and substantial difficulty when merging the various changes into a single document, further compounding the confusion of multiple versions and copies described above.

While this collaboration process has not yet been eliminated, the FNS is building new collaboration sites within SharePoint itself for work on specific projects or for staff. The goal is for staff to migrate gradually from email to these collaboration sites, eliminating email clutter and the excessive growth of inboxes and email subfolders.

Collaboration

Collaboration was another consideration – as described above, collaboration is carried out through long email chains, resulting in disparate versions of the same document that must be merged. The FNS wanted to consolidate and standardize the versioning process to make collaboration more efficient. As a corollary to collaboration, they also sought to give partners better access to additional content without geographic restrictions – specifically, access to the historical treaty collections. The FNS also hopes that increased use of SharePoint collaboration sites will reduce the number of disparate versions with shifting names such as "document-draft1", "document-draft2", "document-final", and so on.

The FNS opted for an implementation designed with governance structures and protocols applied to content from the outset, to limit, if not prevent, the proliferation of unmanaged content as happened with the shared drive. Rather than letting end users create content spaces at will, the core Records Management team is responsible for adding spaces for new content, which users can then consult, allowing the team to maintain control and consistency.

The FNS must also contend with older, obsolete content created in discontinued software, and with the rapid appearance of new software versions that render older content inaccessible – for example, they have a large number of WordPerfect reference documents but no longer use WordPerfect. They are currently converting these documents to preserve the records and make them accessible within SharePoint. Most of these documents are converted to PDF, uploaded to SharePoint and then deleted from the shared drive, with the aim of eventually eliminating the shared drive altogether.

 

Classification in the digital environment

The FNS began with an alphanumeric classification for its paper records, with more than 5,000 categories, indexed for retrieval in a simple DBTextworks database. The classification is highly granular to allow efficient document retrieval. Naturally, they had to rethink how their classification scheme would carry over into a digital environment.

The FNS has a large vocabulary of metadata terms, enforcing standardized descriptions and arrangement, along with a folksonomy of "tags" or topics to which users can contribute to further describe records. By leveraging the power of metadata, they were able to reduce the total number of categories to 96 buckets in which records are organized. They have since expanded these categories somewhat to accommodate variations in how content is saved and recorded.

Some of the major differences between classification schemes for paper records and those for electronic records are the use of categories as metadata, the physical limits on how many documents can be stored in a single location, and security.

Metadata can be used to classify content regardless of location

Compared to a paper classification system, where documents in a given classification are typically filed together in one physical location, in an electronic content management system the content's location within the system need not be tied to its file category. A document's metadata can be used to classify it correctly according to its content, no matter where it "lives".

Classification does not need to be used as metadata

Since we do not have to tie a document's location to its classification, there is also no need to split classifications into subcategories just to sort documents into meaningful, appropriately sized groups. Many documents can remain in a single document category, with metadata used to sort them in any other way. For example, instead of a "by year" category – which essentially uses the file category to add metadata to the document – they can have a single category and use metadata within the system to separate documents by year.
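As an illustration of this metadata-first approach, a single flat category can be sliced by a "year" metadata field on demand. This is a generic sketch, not Collabware's or SharePoint's actual API; the documents and field names below are invented for the example:

```python
# Sketch: one flat category, with metadata doing the work folders used to do.
# The documents and field names below are invented for illustration.
from collections import defaultdict

documents = [
    {"title": "Assembly transcript", "category": "Meetings", "year": 2016},
    {"title": "Resolution 2017-03", "category": "Meetings", "year": 2017},
    {"title": "Agenda, June session", "category": "Meetings", "year": 2017},
]

def group_by_year(docs):
    """Group documents by their 'year' metadata instead of by subfolder."""
    by_year = defaultdict(list)
    for doc in docs:
        by_year[doc["year"]].append(doc["title"])
    return dict(by_year)

print(group_by_year(documents))
# A single "Meetings" category, separable by year without "by year" subfolders.
```

The same collection could just as easily be regrouped by author, topic or status, which is the point: the grouping is a view over metadata, not a filing location.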

Many documents can reside in the same location

They are also much freer in terms of volume limits. In the physical world, it could become very difficult to have thousands of documents in a single category, since finding specific content within it would become increasingly laborious. But because location and metadata are decoupled from classification and filing in the electronic world, this is no longer a problem. There can be 200,000 documents in a single classification, spread across multiple collections in the system. Users can still find what they are looking for using metadata and search, just as most people now use iTunes or Spotify to easily locate one song in a catalogue of many millions.

Classification has been decoupled from security overall, except for semi-active content

While the points above free us from having to use file categories to simplify the classification scheme, security remains a constraint in electronic records management. Beyond SharePoint's built-in security model, which lets us define who can access and/or modify documents, provisions must be made for semi-active documents. When a document is no longer relevant to users, it is sent to the archives to be retained as required, no longer cluttering the document collections users rely on to find current material. Physical content is filed in certain categories and can be kept in a locked cabinet or room, and we apply security at the classification level for semi-active content. Ultimately, the new security setting is applied to all content in a given classification once it is moved to the archives, regardless of the security applied to individual items during their active phase.
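The behaviour described above, where a classification-level security setting overrides item-level permissions once content goes semi-active, can be sketched as follows. This is an invented data model for illustration only, not Collabware CLM's or SharePoint's actual security implementation; the classification names and group names are placeholders:

```python
# Sketch: classification-level security overriding item-level security
# when a classification is moved to the archives (semi-active status).
# The records, classifications and group names are invented placeholders.

items = [
    {"id": 1, "classification": "Treaty Files", "acl": {"staff", "legal"}},
    {"id": 2, "classification": "Treaty Files", "acl": {"staff"}},
    {"id": 3, "classification": "Admin", "acl": {"staff"}},
]

def archive_classification(items, classification, archive_acl):
    """Move an entire classification to semi-active status: every item in it
    receives the archive ACL, regardless of its active-phase permissions."""
    for item in items:
        if item["classification"] == classification:
            item["acl"] = set(archive_acl)
            item["semi_active"] = True
    return items

# Archiving "Treaty Files" restricts all of its items to the records team,
# while items in other classifications keep their active-phase permissions.
archive_classification(items, "Treaty Files", {"records_team"})
```

The key design point mirrored here is that the security decision is made once, at the classification level, rather than item by item.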

 

Preservation in the digital environment

The FNS has also leveraged Collabware's automation capabilities as part of its retention policy. Through workflow development using the visual editor, they created a set of workflows that more accurately reflects the lifecycle of their records, and applied those workflows across the collection using Content Rules applied to custom content types. Content Rules are a Collabware CLM feature that allows the system to identify, classify and declare records automatically according to defined rules. The rules are designed to identify the appropriate documents through unique combinations of content types applied to collections, and metadata entered by users or supplied by the documents themselves.

An example of a content rule might be: "Any document of the Meeting Document content type that is in PDF format and has been given Final status by a user." In this example, the combination of content type and user-supplied metadata can be used to automatically classify the document as a record, in the appropriate record class, without direct user intervention. The mechanism can even go unnoticed, as the user has simply followed the usual process for adding documents to the system.
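The rule quoted above reads as a simple conjunction of conditions on a document's content type and metadata. As a hedged sketch in plain Python (Collabware CLM's actual rule engine is configured through its own interface; the field and class names here are invented for illustration):

```python
# Sketch of the content rule quoted above as a predicate:
# content type "Meeting Document" AND PDF format AND user-set "Final" status.
# The field names and record class name are invented placeholders.

def matches_meeting_record_rule(doc):
    return (
        doc.get("content_type") == "Meeting Document"
        and doc.get("format") == "pdf"
        and doc.get("status") == "Final"
    )

def auto_declare(doc):
    """Declare a matching document as a record with no user intervention."""
    if matches_meeting_record_rule(doc):
        doc["declared_record"] = True
        doc["record_class"] = "Meeting Records"  # placeholder class name
    return doc

doc = {"content_type": "Meeting Document", "format": "pdf", "status": "Final"}
auto_declare(doc)
# doc is now declared as a record in the placeholder "Meeting Records" class.
```

Because the check runs on metadata the user supplies anyway, declaration happens as a side effect of normal filing, which is exactly why the mechanism can go unnoticed.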

Once classified, content follows the workflow associated with its record category. Many actions and processes are possible. For example, a document may be locked, given a different security protocol, sent to the archives, put through a review process, or even undergo final disposition.

Image 2: An example of a Collabware CLM Lifecycle Management Workflow4

 

Site navigation

The development of the home page structure and layout was largely driven by how staff use it – it was determined that the best approach was to follow the top level of the classification structure, with broad categories as a starting point leading down to the more granular organization of documents. Each category appears on the home page with its own icon so users can easily identify and select it. The most frequently consulted sections, grouped under the "Summit Activities" heading, are also listed in a menu at the top of the screen, with subcategories displayed on hover.

 

Image 3: Home page of the FNS SharePoint site

 

Within each category there are subcategories and document collections. The various subcategories are listed across the top of the page, while the collections are available down the left side of the page.

 

Image 4: An example of a subcategory

  

Within a given collection, metadata is used to help users search, filter and group documents, making them easier to find.

 

Image 5: An example of a document library containing metadata displayed across various categories5

 

SharePoint Challenges

Throughout the implementation, the FNS faced a number of challenges with SharePoint:

  • Vocabulary and conceptual issues;
  • A lack of documentation and product support resources from Microsoft;
  • Naming conventions.
Vocabulary and Concepts

Misunderstandings were frequent and often unavoidable in discussions between the Records Manager and the first team of consultants because of the vocabulary shared between SharePoint and established records management terminology. Concepts such as “policy” and “record”, each of which has a specific meaning both in SharePoint and in the established RIM vocabulary, became sore points as the project progressed. The need to work with a consulting firm that has a solid understanding of both records management and the SharePoint Records Center cannot be overstated, since its command of the subject will avoid countless hours of frustration arising from misunderstood terms.

Similarly, SharePoint’s “content type” (an entity in SharePoint used to apply metadata columns, workflows, settings, and other data-capture features to content) was initially a particularly difficult concept to grasp. The content type feature and its relationships within the SharePoint environment were not immediately clear and were not distinctly explained. As the project evolved, the concept of the content type gradually became clearer, and as its multiple layers of functionality were revealed, more content types were created and deployed to describe these new kinds of content.

Documentation and Support Resources

Compounding the challenge, and exasperatingly, Microsoft does not produce user support manuals to ease the learning process, and few online tools or resources are available to the “lay” user. There are several third-party manuals on the market, but most of them target expert administrators, are expensive and, given the rapid pace at which the product evolves, quickly become obsolete. Increasingly, these third-party manuals address the cloud-based Office 365 SharePoint Online option rather than the on-premises server implementation the FNS runs. The FNS eventually overcame the lack of support documentation through repetition of activities and tasks, as well as through significant contributions from the new consulting team. The FNS subsequently created both a User Guide specific to its SharePoint site and a Governance Manual to help users and administrators understand, maintain, and adapt the implementation, and thus keep pace with users’ changing needs.

Naming Conventions

As for naming conventions, SharePoint uses two distinct fields to describe documents: the name and the title. The name field counts directly against the 256 characters allowed for a SharePoint URL, since the name forms part of the URL used to retrieve a document. This is an important detail, one they learned the hard way when their content suddenly became unavailable because the URL had exceeded Microsoft’s 256-character limit. The significance of the name field was unknown to them when they began uploading content with the long, descriptive names employees had used on the shared drive. After several panic attacks and a rather emotional meltdown upon discovering that their efforts had been undone by their naming, they finally had to admit that they had been relying on the wrong field, requiring major intervention by the consultants and many hours of data revision.

The FNS subsequently adopted a Document Naming Best Practice, inspired by a similar one published by the Stanford University Libraries, which incorporates international date formatting for clarity. They used the title field for a more descriptive name, to tell the user what the document is about.
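A minimal sketch of such a naming convention, assuming a hypothetical site URL and naming pattern, with an ISO 8601 (international) date prefix and a check against SharePoint’s 256-character URL limit:

```python
# Hedged sketch of a document naming convention: short name with an
# ISO 8601 date prefix, verified against the 256-character URL limit.
# The site URL and the naming pattern are hypothetical.
from datetime import date

URL_LIMIT = 256
SITE = "https://fns.example.org/sites/records/Shared%20Documents/"

def make_name(doc_date, short_name, ext="pdf"):
    """Build a short file name prefixed with an ISO 8601 date."""
    return f"{doc_date.isoformat()}_{short_name}.{ext}"

def url_ok(name):
    """Check that the full retrieval URL stays within the limit."""
    return len(SITE + name) <= URL_LIMIT

name = make_name(date(2018, 5, 28), "AGM-Agenda")
print(name)          # 2018-05-28_AGM-Agenda.pdf
print(url_ok(name))  # True
```

The descriptive wording would then go into the title field, which does not count against the URL length.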

 

Image 6: An example of documents with short file names but descriptive titles.

 

SharePoint Wins

The FNS finally made SharePoint available to end users in early 20186 and is currently engaged in group training sessions and one-on-one support activities with staff members, management, and the consultants. Through a series of lunch-and-learn sessions, they familiarized users with the structural elements of the implementation so they could get to know the platform. The FNS also holds smaller training groups focused on different interactions with content, such as navigating the site, working within a collection using column filtering, local search, and paging through results.

Search

SharePoint’s search function has without question been the biggest win in terms of staff buy-in. It allows much more efficient content searching than the desktop search tools previously available, saving time and causing far less frustration for staff.

SharePoint search also improves work with content that has not yet been migrated to SharePoint, or that never will be, since it has been configured to index the shared drives, allowing users to search content across all digital collections more efficiently. SharePoint search improves on ordinary searching by matching not only a document’s title and metadata but also the content of the document itself, as a keyword search. There is one restriction, however: shared-drive search results can only be viewed by users on the office network.
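A toy model of this kind of search, matching a keyword against title, metadata, and full text across both sources; the sample documents are invented, and real SharePoint search of course uses a crawled index rather than a linear scan.

```python
# Illustrative keyword search across two content sources, matching on
# title, metadata values, and document body. Data are hypothetical.

corpus = [
    {"source": "sharepoint", "title": "Board Minutes",
     "metadata": {"topic": "governance"},
     "body": "Motion carried on the budget."},
    {"source": "fileshare", "title": "Old Budget",
     "metadata": {"topic": "finance"},
     "body": "Draft budget figures."},
]

def search(keyword, docs):
    """Return (source, title) pairs for documents matching the keyword."""
    kw = keyword.lower()
    hits = []
    for d in docs:
        haystack = " ".join(
            [d["title"], d["body"], *map(str, d["metadata"].values())])
        if kw in haystack.lower():
            hits.append((d["source"], d["title"]))
    return hits

print(search("budget", corpus))  # matches both sources
```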

 

Image 7: An example search, showing that both SharePoint and file share content is displayed, and that the content of documents is searched as well.

 

Off-site Access

The FNS wants users to be able to interact with all of their resources no matter where they are: management members are rarely at their desks, and staff members are often in off-site meetings with their constituents or partner organizations.

For this reason, they created a dedicated external workstation for outside access. Off-site users connect by opening a remote session with the designated workstation on the office network, allowing them to use SharePoint’s search function while away from the office. Shared-drive content can then be identified and uploaded to SharePoint during the remote session, making it available to all users in an ordinary browsing session.

The FNS’s legal counsel have been particularly active users of remote sessions, searching for and retrieving relevant documents on the fly without having to involve staff. They are currently learning to upload content created on behalf of the FNS.

Search Tools

The FNS has also recently used Collabware’s Information Query search tool and the existing “FNS Topic” folksonomy to present search results as a single list even though the documents reside in various locations across the SharePoint collections. This search tool has been extremely helpful in improving the FNS’s ability to respond to requests from legal counsel and to participate in research.

 

Image 8: An example of using the Information Query function to obtain a list of content related to a specific topic, regardless of where that content resides.

 

The Future of the FNS

The FNS is currently developing SharePoint’s collaborative functionality. New staff members and legal counsel have requested the creation of collaboration sites to facilitate work on various activities, including sites for working groups, committees, and research activities. They hope SharePoint will continue to evolve to address new problems, needs, and capabilities as Microsoft’s software also evolves. The FNS is aware of the development of the new SharePoint 2019 and will consider upgrading when it is eventually released, though not before the glitches and hiccups have been worked out.

 

Works Cited

1This article was originally presented by the authors at an ARMA Canada conference in May 2018 in Vancouver, B.C.
2All purchases of SharePoint Server and client access licences (CALs) were made through TechSoupCanada.ca, which offers specially reduced prices for non-profit and charitable organizations.
3The Office of the Privacy Commissioner of Canada offers useful links on cloud computing and privacy; however, the most recent article there dates from June 2012 and is far too old to be considered current, given the rapid pace of technological change. Questions of ownership and jurisdiction over data stored in the cloud are not addressed.
4It is important to note that we implemented a one-year delay before a document is declared a record, to allow for additional modifications or error corrections.
5The FNS set up a column, “True Document Date”, to record the actual date of the document as it appears on the document itself, as opposed to the date applied automatically when the document is uploaded. This is particularly important in helping to identify historical documents, which were often uploaded years after they were created or received.
6It is important to note that the document conversion process was substantially lengthy, leading to a significant delay between implementation and launch.

A Record of Service: The First Canadian Conference on Records Management “A Giant Success”1

 

By Sue Rock, CRM

 

In 1977 ARMA International conducted a vote among all Canadian members to determine whether a separate Canadian Region should be formed. The result was an overwhelming response in favour of forming what was designated Region VIII, now known as ARMA Canada.

In the winter of 1980, Region VIII hosted its first conference. It was a startling, yet not unpredictable success. After all, Canadian records managers had served prominently within ARMA International leadership throughout the ‘70s. Many had presented papers at ARMA International conferences, and had received prestigious International awards. They knew how to market records management and how to serve an eager audience. Their goal was to showcase what Canadians can accomplish despite geographical and cultural differences.

The Conference theme was “Sea to Sea Information”. It was held from February 4 – 6, 1980 in Banff, Alberta.

The final registration of the First Canadian Conference on Records Management totalled 252 participants with representation both in speakers and general attendance from every province and territory except Prince Edward Island.

Over a three-day period, the conference offered a staggering 42 educational sessions, ranging in duration from 1.5 hours to full day sessions. Not one speaker received expense money from conference funds and some speakers came at their own expense. Organizations who defrayed the expenses of their employees to attend were recognized as an indication of the importance of records management as the discipline entered the 1980s.

First Conference 1980 Educational Content and Outcomes

The Conference received “super-positive feedback”1 from the participants on the quality of speakers. As a result, a Canadian Speakers Roster was developed by Judi Harvey, Vancouver Chapter, to be provided to all chapters in Region VIII as well as the ARMA Speakers Bureau.

The sessions which received the highest ratings are equally sustainable from our 21st century perspective. Though we may find references to microfilm as ‘quaint’, remember, the Standards Council of Canada had recently approved the new National Standard CAN-CGSB-72.11-M79 Microfilm as Documentary Evidence which remained a solid reference tool right into the 21st century.

Sessions highly rated included: Fundamentals of Records Management; Microfilm vs Paper; Systems Approach to Records Management; Records Centres; Micrographics Equipment Review; Directive Management; Effective Writing; Computer Output Microfilm; Resistance to Change; Word Processing; Cost Benefit Analysis; Forms Management; Implementation of Records Management Programs; Interface between Libraries and Records Management; and Special Interest Group on Petroleum.2

The keynote speaker at the first day’s luncheon was Dr. W. I. Smith, Dominion Archivist, Public Archives of Canada. His topic was: Records Management in Canada: Challenge and Opportunity. Day two featured Ms. Anneliese Arenburg, President, Arenburg Consultants Ltd. speaking on Records Management Future: A consultant’s point of view.

Much like our current “Poster Board” conference sessions, the First Conference offered a “Call for Papers”, with a 10-minute on-site soundbite per author. These 10-minute sessions permitted any participant to share an informative management project or system that had been successfully implemented in their organization. The paper would then be published in the conference proceedings.

First Conference Chapter Leadership Outcomes

In the background of Conferences, ARMA leadership teams get together to exchange ideas and develop a cohesive strategic direction. Noteworthy is that all Chapter Presidents representing the seven Chapters: Calgary, Edmonton, Montreal, Ottawa, Toronto, Vancouver, and Winnipeg, were in attendance, which led to a very constructive Chapter Presidents’ Meeting.

At this First Conference, Gerry Roussy of Fredericton, New Brunswick coordinated a meeting of individual Chapter Education Directors. The end result was that the Canadian Position Paper on Records Management Education would be authored by Gerry, based on feedback received from each Chapter. Records management educational circulars and course outlines from each Chapter’s geographic location were shared at the meeting.

A legacy from the First Conference was that the Chapter Presidents recommended a Canadian Region Conference be held every two years and should be rotated East, Central, and West. [Ironically, the theme of the First Conference was “Sea to Sea Information” – note, Maritime Chapters did not exist at the time.] As an immediate result, Winnipeg, Toronto and Montreal indicated interest in hosting the 1982 Region VIII Conference. In 1982, the second Canadian Conference was held in Montreal, with presentations and proceedings in both official languages.

First Conference Awards and Recognition

One of the highlights of the Conference was the presentation of the William J. Gray Award3 for research and development in the field of Information Management which was presented to the two main forces in the Canadian Records Retention Project, Ted Hnatiuk and Jim Coulson of Toronto. Certificates of appreciation were awarded to Peg Barlow and Stephen Pollishuke who contributed a great deal to the project.

The Canadian Records Retention Project was also known by its acronym SCORR – the Special Committee on Records Retention, chaired by Ted Hnatiuk with Carl Weise and Jim Coulson, and all members of the ARMA Toronto Chapter.4 With participation from all Canadian ARMA Chapters, SCORR was successful in petitioning the Federal government to produce a definitive list of Federal regulatory retention requirements. The first “red book” of Federal records retention regulatory requirements in Canada was published in 1980. It addressed 27 Federal departments with 74 Federal Statutes, 111 Federal Regulations with 67 different retention periods placed on business records. Approximately 50% of the regulations did not contain stated retention periods.

The Infomatics Award for furtherance of interfacing with other disciplines within the field of Information Management was presented to Patricia Schick, Edmonton Chapter (cited in many ARMA journals under her subsequent name, Patricia Acton, and a future president of the Institute of Certified Records Managers (ICRM)). This award was first sponsored in 1978 by the ARMA Edmonton Chapter, intended to be presented annually to a “deserving person making outstanding contributions in the information processing field”.5 The award description includes “computing and systems” as one of the three primary areas of information and records management at the time.

ARMA Canadian Pioneers comprising founding Chapter Presidents, Former Regional Vice Presidents, ARMA International Officers and Charter Members who had remained active in ARMA for a number of years were honoured at the Conference. The following pioneers were presented with Certificates of Appreciation at a special Pioneer Luncheon:

Calgary: James C. Carrow; F. Whiteley Graham; Donald E. Worden
Edmonton: Robert P. Morin CRM; Sandra Ercse; Howard Brinton
Montreal: Denis Deslongchamps CRM and former Regional VP; William Gray CRM; John Andreassen CRM; Margaret Sadler former ARMA International Secretary; George Lawson; Clifford Wynn
Ottawa: Ralph Westington; Milton Thwaites; Ted Ferrier; John Smith future ARMA International President
Toronto: D.V. Andres CRM; A. Arenburg CRM; D.T. Barber CRM and former Regional VP; J.W. Champers; P. Collins; P.M. Fraser; G. Gammie; W.J. Hardy; J.T. Harrison; G.J. Joyce; W.F. Ludlam CRM; H.A. Moulds; G.B. Mowat CRM; A.W. Murdoch; V. Niggl CRM; G.M. Pratt; J.E. Thorne; K.A. Wadwell; R.L. Webb; R.A.H. Westmore; G. Fletcher former Regional VP
Vancouver: Judi Harvey; Pam Neish; Olive J. Pennock; Anthony P. Farr; Harry C. Chapin; Jacqueline Rowland; Harold A. Findlater; Edith Laurenda Daniells
Winnipeg: Terry Smythe; Ken Smith; J. A. Spokes, future ARMA International President

First Conference Networking Opportunities

The Conference was hosted in Banff, Alberta in mid-winter, February 4 – 6, 1980. The logistics of shuttling participants to and from the location must have been staggering. However, those same logistics would have provided an opportunity for immediate networking!

Here’s a snapshot of the Guest Spouse Program, at an additional cost of $55.00 for the three-day tour and activities. How about that “romantic bus ride to Elkana Ranch”?

 

 

Conferences are often fondly remembered for their social networking opportunities. For seasoned ARMA Canada conference participants, many will fondly recall outings often disrupted by the unexpected, such as a moose halting a convoy to a picnic location at the 1999 event in Saskatoon!

The First Canadian Conference 1980 offered the usual Sunday evening welcoming cocktail party, followed by a Monday wine and cheese party, with the Tuesday Night at Elkana Ranch as the grande soirée for a fine dinner, requisite speeches and awards, followed by an evening to schmooze.

As a review of the First Canadian Conference 1980 draws to a close…

The delivery of educational opportunities remains a cornerstone of ARMA activity. Conferences fulfill ARMA’s educational objectives by presenting au courant information-related topics to an eager learning audience. Networking at conferences is an unmeasurable deliverable for ARMA, but one can confidently conclude that the sharing and exchanging of experience and information related to the field of records and information management begins the moment one registers for a conference.

Most of us head off to the annual ARMA Canada Conference happy to be a participant and to receive thoughtfully orchestrated education. When one considers the responsibilities a Conference team assumes, it becomes apparent that a conference’s success is achieved through the committee members’ devotion of personal time and effort before and during the event.

Annual ARMA Canada conferences are a tribute to the pioneers who organized the First Canadian Conference on Records Management in 1980. Imagine how they celebrated the obvious success of their investment! They created the path for us to continue to develop and advance standards of professional competence in the field of records and information management.

Interviewer: In spite of all the difficulties in records management, why do you remain in the field?
King Lear: I am tied to the stake, and I must stand the course.6

 

Works Cited

1Canadian Notes by Don H. Bell, ARMA Quarterly April 1980 p52
2Ibid
3William J. Gray, CRM ARMA Montreal Chapter; author – “Correspondence Course in Systems and Procedures” ARMA Quarterly October 1980 p 20
4Jim Coulson, CRM, FAI, author – “Some Personal Reflections on the History of RIM in Canada”, Sagesse Spring 2016 Vol. 1 Issue 1
5ARMA Quarterly April 1979 p 48
6William Benedon Interviews William Shakespeare on Records Management, ARMA Quarterly January 1980 p 30

Winter 2018 Edition

Biographies

 

Sharon Byrch is an experienced local government records and privacy manager, working for the Capital Regional District in Victoria, BC. Her mission is helping government digitize and overcome information chaos by optimizing valuable information assets as a ‘single source of managed truth’ through applied IM/ERM strategy, learning and effective partnerships.

Uta Fox, CRM, FAI is the Manager of Records and Evidence Section, Calgary Police Service. She holds a Master’s Degree from the University of Calgary, is a Certified Records Manager and in 2017, made a Fellow of ARMA International – #55. Uta represented ARMA Canada on the Canadian General Standards Board Committee that updated CAN/CGSB 72.34-2017 Electronic records and documentary evidence.

Carolyn Heald is Director, University Records Management and Chief Privacy Officer at Queen’s University in Kingston, Ontario.  She started her career as an archivist at the Archives of Ontario before moving into policy and records management.  Later, she took on both privacy and records management at York University in Toronto.  She holds an MA in History and a Master’s of Library Science, and is a Certified Information Privacy Professional (CIPP/C) from the International Association of Privacy Professionals.

 Stuart Rennie, JD, MLIS, BA (Hons.) has a Vancouver, Canada-based boutique law practice where he specializes in records management, privacy and freedom of information, law reform, public policy and information governance law. He is a member of ARMA International’s Content Editorial Board. Stuart is also an adjunct professor at the School of Library, Archival and Information Studies at the University of British Columbia in Vancouver.

Sue Rock, CRM, is owner of The Rockfiles Inc. and an operating partner of Trepanier Rock®, with a focus on ensuring records management principles are embedded into all information management solutions. She is an active supporter of ARMA. Sue influences friends and colleagues with her enthusiasm for  historical content captured in ARMA’s IM journals.

 Joy Rowe is the Director of Records Management Compliance for a private sector company in metro Vancouver, and she previously worked as the Records Management Archivist at Simon Fraser University. She is a graduate of the School of Library, Archival, and Information Studies at University of British Columbia. She has published articles and presented on topics such as openly-licensed training tools for records creators, accessibility and universal design, and human rights and records. She was one of six selected from an international pool to be part of the 2017-2018 New Professionals Programme of the International Council on Archives.

Sagesse: Journal of Canadian Records and Information Management an ARMA Canada Publication Winter, 2018 Volume III, Issue I

  

Introduction

ARMA Canada is pleased to announce the launch of its third issue of Sagesse: Journal of Canadian Records and Information Management, an ARMA Canada publication.

This latest edition features a biography of an ARMA Canada pioneer, a thorough review of metadata, a discussion of the Canadian General Standards Board updated CAN/CGSB 72.34-2017 Electronic records as documentary evidence and a look at privacy challenges at Ontario’s universities in regards to records management.

One of the goals of Sagesse’s editorial team is to honour and bring forward former Canadian members involved with ARMA Canada / ARMA International who have made significant contributions to ARMA and the records and information management (RIM) and information governance (IG) industries. Sue Rock, CRM, accepted this challenge and provided a biography of Bob Morin, CRM. Bob was a founding member of ARMA Canada Region, a founding member of three of ARMA Canada’s chapters and a champion of ARMA International. To learn more about his tremendous achievements, see “A Record of Service: Bob Morin, CRM,” in this issue. This article has been translated into French.

“Metadata in Canadian case law,” by Joy Rowe is a thorough exploration of metadata and its value focusing on admissibility as well as records identification, retrieval and assisting in the authenticity and integrity of records systems. As well, the article examines case law including those related to the business records exception to the hearsay rule, the best evidence rule and evidence weight, use of standards in authentication and computer-generated versus human-generated electronic records.

And in keeping with the admissibility of electronic records, Sharon Byrch, Uta Fox, CRM, FAI and Stuart Rennie collaborated to write “The New CAN/CGSB 72.34-2017 Electronic records as documentary evidence.” They discuss the major changes made from the 2005 version to that of 2017 in regards to legal issues, records management and information technology. The authors wish to thank the Canadian General Standards Board for reviewing this article. Please note: this standard is now available in both French and English and can be obtained at no cost on their website (see the article for the links). This article has also been translated into French.

Carolyn Heald examines “Privacy-Driven RIM in Ontario’s Universities” and the role of privacy as a means of implementing records management in Ontario’s universities. Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) was amended in 2014, and since then more and more universities have established records management programs because of the recordkeeping amendments. This article is relevant to RIM/IG practitioners in all other industries.

We would like to take this opportunity of thanking you, the RIM and IG professionals that access and use the information in these issues in your workplaces and for education and / or training purposes.

Sagesse’s articles have grown from hundreds of downloads to more than 18,000. That means this information is relevant, useful, practical and appreciated; ARMA Canada is most pleased to be able to provide this content for you.

Please note the disclaimer at the end of this Introduction which notes that the opinions expressed by the authors are not the opinions of ARMA Canada or the editorial committee. Whether you agree or not with this content, or have other thoughts about it, we urge you to share them with us. If you have recommendations about the publication we would appreciate receiving them. Forward opinions and comments to: armacanadacancondirector@gmail.com.

If you are interested in providing an article for Sagesse, or wish to obtain more information on writing for Sagesse, visit the ARMA Canada’s website – www.armacanada.org – see Sagesse.

Enjoy!

 

ARMA Canada’s Sagesse’s Editorial Review Committee:

Christine Ardern, CRM, FAI; John Bolton; Alexandra (Sandie) Bradley, CRM, FAI; Stuart Rennie and Uta Fox, CRM, FAI, Director of Canadian Content.

 

Disclaimer

The contents of material published on the ARMA Canada website are for general information purposes only and are not intended to provide legal advice or opinion of any kind. The contents of this publication should not be relied upon. The contents of this publication should not be seen as a substitute for obtaining competent legal counsel or advice or other professional advice. If legal advice or counsel or other professional advice is required, the services of a competent professional person should be sought.

While ARMA Canada has made reasonable efforts to ensure that the contents of this publication are accurate, ARMA Canada does not warrant or guarantee the accuracy, currency or completeness of the contents of this publication. Opinions of authors of material published on the ARMA Canada website are not an endorsement by ARMA Canada or ARMA International and do not necessarily reflect the opinion or policy of ARMA Canada or ARMA International.

ARMA Canada expressly disclaims all representations, warranties, conditions and endorsements. In no event shall ARMA Canada, its directors, agents, consultants or employees be liable for any loss, damages or costs whatsoever, including (without limiting the generality of the foregoing) any direct, indirect, punitive, special, exemplary or consequential damages arising from, or in connection with, any use of any of the contents of this publication.

Material published on the ARMA Canada website may contain links to other websites. These links to other websites are not under the control of ARMA Canada and are merely provided solely for the convenience of users. ARMA Canada assumes no responsibility or guarantee for the accuracy or legality of material published on these other websites. ARMA Canada does not endorse these other websites or the material published there.

A Record of Service: Bob Morin, CRM

By Sue Rock, CRM

 

Introduction

 A Record of Service: Bob Morin, CRM coincides with the 150th anniversary of Canada as a country. Bob is one of many Canadian records management pioneers who persevered in establishing records management as a profession.

This tribute focuses on Bob’s achievements within the context of professional organizations such as ARMA and ICRM because Bob believed in the need for common organization.

Bob Morin, CRM died on February 17, 2012 in Saskatoon, Saskatchewan. Bob was a records management pioneer in the truest sense. Remembered fondly for his personality and humour, Bob carved his career through vision, determination, strength and perseverance. He found inspiration from his ARMA International colleagues both in Canada and the US. He sought to measure his records management competency by receiving his CRM designation. He demonstrated continuous dedication to industry associations such as ARMA and ICRM wherein he formed life-long professional partnerships and a multitude of personal friendships.

 

ARMA – How contributions are measured and documented for enduring historical perspective

ARMA employs a system of categories comprising leadership, awards, education, publication and presentations to recognize individual efforts to advance the profession of records management. ARMA membership provides opportunities for individual professional growth, and measures success through these hallmarks. Achievements are documented within ARMA’s historic and present journals.

Bob’s contributions to the profession of records management are documented within ARMA International’s archival collection of journals. The journals, currently known as Information Management, were previously entitled ARMA Records Management Quarterly commencing in 1967 and, as the title states, were published four times per year.

Through his membership as documented within professional organizations, such as ARMA and the ICRM, we begin to derive an understanding of Bob’s personal values, his job involvement, his active participation in decision making, and his motivation – the sphere of influence he created.

 

ARMA – Leadership and the importance of vision

In 1978 the ARMA Canada Region, designated at the time as “Chapter VIII”, was inaugurated. This milestone was achieved through the devotion of time and effort by founding Canadian Chapter Presidents, including Bob Morin, former Regional Vice Presidents, International Officers and Charter Members who remained active in ARMA for a number of years. We celebrate our current Canada Region status in ARMA International as an outcome of their vision and leadership.

At the same time, Bob represented Canada as a voting member and Regional Vice President (1978-80) on the Board of Directors of ARMA International. ARMA documents that, as Region VIII nominee to the National Office for 1978-1979, “Mr. Morin’s education ranges from archival management to computer sciences.”1

Bob focused on the role of technology for managing records. The collective term at the time for individual technologies such as microcomputers, FAX machines, and micrographic equipment, was office automation.

The challenge of rapidly emerging technology solutions to records management problems during this era is strikingly similar to that faced in the 21st century. Identifying the role of technology within Bob’s career is essential to understanding the direction he championed.

By 1980, Bob had already established two ARMA Chapters within Canada: “He was the founder and first president of the Ottawa chapter…and repeated the same organizational effort recently when he moved to Edmonton.”2 On May 2, 1983 the Saskatchewan Chapter was chartered, with Bob as its President, flanked by a stellar ARMA Chapter team.3

 

 

ARMA – Awards and recognition

For championing the role of technology within records management, Bob received the first Infomatics Award in 1978 for his efforts in implementing a records management program for the Government of Alberta. This achievement award, sponsored by the ARMA Edmonton Chapter, was intended to be presented annually to a “deserving person making outstanding contributions in the information processing field.”4 The award description includes “computing and systems” as one of the three primary areas of information and records management at the time.

The first Canadian Conference on Records Management drew 252 participants to Banff, Alberta, February 4-6, 1980. The conference honoured the “ARMA Canadian Pioneers”. Of course Bob was among them! The group comprised “…founding Chapter Presidents, Former Regional Vice Presidents, International Officers and Charter Members who … remained active in ARMA for a number of years … These pioneers were presented with Certificates of Appreciation at a Pioneer Luncheon at the Banff conference, February 6th, 1980.” The pioneers are named, and they represent the seven Canadian Chapters of the time: Montreal, Ottawa, Toronto, Winnipeg, Calgary, Edmonton, and Vancouver.5

An excerpt from one of Bob’s colleagues, Jim Coulson, CRM, FAI breathes life into Bob-the-person with regard to the first Canadian Conference: “Bob Morin, of Saskatoon, was a passionate RIM pioneer who brought records management professionals together from across the country to put on the first Canadian Records and Information conference. Two years later, the conference was held in Montreal, with presentations and proceedings in both official languages. The ARMA Canada conference has since become a highly respected annual event.”6

 

ARMA – Publication and presentation – hallmarks of contribution

Bob shared his on-the-job experience with the ARMA community through both publication and presentation. These communication channels are highly regarded within the ARMA community.

An article Bob wrote for The Records Management Quarterly in 1980, entitled “Management Information Systems – a Total Approach”, speaks to computer literacy as a continuing theme in his career. The article not only informs us of the technology components of records management at the time, but also states the importance of integrating the technologies and disciplines: “The approach stresses the need for analysis of clerical systems in the design stages. Cost effectiveness is illustrated by efficient use of all information technologies such as micrographics, word processing, forms design, data processing and library resources.”7

ARMA published an earlier article by Bob in 1976, “Relocating a Records Management Operation”.8 In this article, Bob’s thesis is that the records manager will become involved in office moves due to “his interest in all aspects of paperwork”. These moves vary from a simple rearrangement to a complete relocation. The article provides a step-by-step plan covering both the preparation and the physical move requirements, of particular value to the records manager. This topic continues to fall within a records manager’s domain in the 21st century!

In 1983, Bob shared his knowledge in a presentation at the 28th Annual Conference, where the conference theme was “The Emerging Information Economy”. His topic was “What the Records Manager Should Know about Data Processing Systems Documentation”. The synopsis states “…the proper handling of magnetic tapes…How to schedule data processing records such as magnetic tapes, databases, and diskettes will be discussed.”9

 

ARMA – Promotion of education for records managers

 Another cornerstone of ARMA professional activity is the delivery of educational programs. Bob was a leader in this area, too.

While Director of Records Management for the Alberta Government Services from 1975-82, “…he directed an eleven-course Training Program that covered File Management, Records Systems, Micrographics, Office Automation, Forms Management, and Data Processing…He taught Records Management at the University of Alberta, Faculty of Library Science, where he was instrumental in establishing a Credit Course in Records Management.”10

It has been said by his students and colleagues that Bob was fiercely generous in helping fledglings. He always raised the collective morale of those around him.

As a memorial, the ARMA Saskatchewan Chapter continues to present the Bob Morin Award to a young professional to financially assist with training in the Records Management field.

Measured against ARMA’s hallmarks of professional status – awards, education, publication, presentation, and sphere of influence – Bob hit the highlights. Any of us who have served in a volunteer capacity might ponder: “Where did he get the energy?” He didn’t stop with ARMA!

 

IRMF – International Records Management Federation

 In 1978, Bob was named as one of ARMA International’s two delegates to a professional organization called the International Records Management Federation (IRMF).11

Initially, the Federation was designed to serve the worldwide interest that records management was experiencing. The Records Management Quarterly published a column called Outlook for the Federation to share its goals, membership bragging rights, and achievements. “Records Management is no longer an all-American sport although many of we ‘internationals’ still gaze in awe at ARMA – 4800 plus strong and still growing, and what a Management team!”12

By 1982, Bob had advanced to the role of Federation President.13 A position paper examining the future of the Federation was presented that same year to ARMA International. As a result, a new constitution was reviewed and approved by the Executive Board of the IRMF, and the organization was renamed the International Records Management Council.14 Another “first” for Bob.

 

ICRM – Institute of Certified Records Managers

Bob went on to assume leadership in, and to share his sphere of personal influence with, the ICRM – the Institute of Certified Records Managers. He served as an officer of the Board of Regents from January 1986 through December 1988.

 

In conclusion …

Bob is one of many colourful Canadian records management pioneers. Their adaptability, their determination and their drive to achieve have created a remarkable example of excellence in the Canadian records management business scene.

It is a particular challenge for those of us who remain, or who are new to the profession, to pause and reflect on their journey. We are on our own records management career journeys – forging our way while learning the time management skills to prioritize and succeed.

Time for reflection? Consider it a professional vacation. A few minutes spent looking back into a previous century will reap the reward of inspiration and awe for those who faced very similar circumstances: the influence of society, politics, world events and, always, the advance of technology.

 

Bibliography

1 “Association News and Events”, ARMA Records Management Quarterly, 12, Number 3 (1978), 51-52.
2 “Association News and Events”, ARMA Records Management Quarterly, 12, Number 3 (1978), 51-52.
3 “Association News and Events”, ARMA Records Management Quarterly, 17, Number 3 (1983), 56.
4 David H. Bell, “Canadian Notes”, ARMA Records Management Quarterly, 13, Number 2 (1979), 48.
5 Don H. Bell, “Canadian Notes”, ARMA Records Management Quarterly, 14, Number 2 (1980), 52.
6 Jim Coulson, “Some Personal Reflections on the History of RIM in Canada”, 2017.
7 Robert P. Morin, “Management Information Systems – A Total Approach”, ARMA Records Management Quarterly, 14, Number 2 (1980), 5-6, 16.
8 “Records Management Quarterly Cumulative Index of Articles (Annotated) January 1967 – October 1977”, ARMA Records Management Quarterly, 11, Number 4 (1977), 63.
9 “28th Annual Conference ARMA The Emerging Information Economy”, ARMA Records Management Quarterly, 17, Number 3 (1983), 61.
10 “ICRM Information”, ARMA Records Management Quarterly, 19, Number 4 (1985), 62.
11 “Outlook – International Records Management Federation”, ARMA Records Management Quarterly, 12, Number 2 (1978), 55.
12 “Outlook – International Records Management Federation”, ARMA Records Management Quarterly, 12, Number 2 (1978), 55.
13 ARMA Records Management Quarterly, 16, Number 2 (1982), 69.
14 “IRMF Reorganizes: Announces New Publications”, ARMA Records Management Quarterly, 16, Number 2 (1982).


1 An emphasis on computer sciences became a key ingredient to Bob’s career in records management.

Un récit de service: Bob Morin, CRM

par Sue Rock, CRM

 

Introduction

Un récit de service: Bob Morin, CRM, coïncide avec le 150e anniversaire du Canada en tant que pays. Bob est l’un des nombreux pionniers de la gestion des documents au Canada qui ont persévéré pour l’établissement de la gestion des documents en tant que profession.

Cet hommage se concentre sur les réalisations de Bob dans le contexte d’organisations professionnelles, telles que l’ARMA et l’ICRM, car Bob croyait en la nécessité d’une organisation commune.

Bob Morin, CRM, est décédé le 17 février 2012 à Saskatoon, en Saskatchewan. Bob était un pionnier de la gestion des documents dans le vrai sens du terme. Commémoré affectueusement pour sa personnalité et son humour, Bob a façonné sa carrière avec vision, détermination, force et persévérance. Il trouvait son inspiration parmi ses collègues d’ARMA International, au Canada et aux États-Unis. Il a cherché à mesurer sa compétence en gestion des documents en recevant son titre CRM. Il a fait preuve d’un dévouement constant envers les associations de l’industrie telles que l’ARMA et l’ICRM, où il a formé des partenariats professionnels à vie et une multitude d’amitiés personnelles.

 

ARMA – Comment les contributions sont mesurées et documentées pour créer une perspective historique durable

ARMA emploie un système de catégories comprenant le leadership, les prix, l’éducation, la publication et les présentations afin de reconnaître les efforts individuels pour faire progresser la profession de gestion des documents. L’adhésion à l’ARMA offre des opportunités de croissance professionnelle aux individus et mesure leur succès grâce à ces caractéristiques. Les réalisations sont documentées dans les revues passées et présentes de l’ARMA.

Les contributions de Bob à la profession de gestion des documents sont documentées dans la collection de revues d’archives d’ARMA International. Les revues, actuellement connues sous le nom de Information Management (Gestion de l’information), étaient auparavant intitulées ARMA Records Management Quarterly (Gestion de documents trimestrielle d’ARMA) à partir de 1967, et comme l’indique le titre, étaient publiées quatre fois par an.

Grâce à son appartenance à des organisations professionnelles telles que l’ARMA et l’ICRM, nous commençons à comprendre les valeurs personnelles de Bob, son implication professionnelle, sa participation active à la prise de décision et sa motivation – la sphère d’influence qu’il a créée.

 

ARMA – Leadership et importance de la vision

En 1978, la Région canadienne de l’ARMA, désignée à l’époque sous le nom de « Chapitre VIII », a été inaugurée. Ce jalon a été atteint grâce au dévouement en temps et en détermination dont ont fait preuve les présidents de sections canadiennes, tels que Bob Morin, les anciens vice-présidents régionaux, les dirigeants internationaux et les membres fondateurs, qui sont demeurés actifs au sein de l’ARMA pendant plusieurs années. Nous célébrons le statut actuel de notre région dans l’ARMA International grâce à leur vision et leur leadership.

Au même moment, Bob représentait le Canada comme membre votant au poste de vice-président régional 1978-80 au conseil d’administration d’ARMA International. L’ARMA documente que, en qualité de candidat à la Région VIII, Bureau national 1978-1979 « La formation de M. Morin s’étend de la gestion des archives à l’informatique. » 1 L’accent mis sur les sciences informatiques est devenu un élément clé du succès de la carrière de Bob dans la gestion des documents.

Bob s’est concentré sur le rôle de la technologie dans la gestion des documents. Le terme général de l’époque pour les technologies individuelles, telles que les micro-ordinateurs, les télécopieurs et l’équipement micrographique, était la bureautique.

Le défi que représentaient les solutions technologiques émergentes face aux problèmes de gestion des documents de l’époque est étonnamment semblable à celui du XXIe siècle. L’identification du rôle de la technologie dans la carrière de Bob est primordiale dans la direction qu’il défend.

Dès 1980, Bob avait déjà établi deux sections d’ARMA au Canada: « Il a été le fondateur et le premier président de la section d’Ottawa…et a récemment répété les mêmes efforts organisationnels lorsqu’il a déménagé à Edmonton. 2 » Le 2 mai 1983, la section de la Saskatchewan a été agréée, avec Bob en tant que président, encadré par une excellente équipe de la section d’ARMA. 3

 

 

ARMA – Prix et reconnaissance

En promulguant le rôle de la technologie dans la gestion des documents, Bob a remporté le premier Prix de l’informatique en 1978 pour ses efforts dans la mise en œuvre du programme de gestion des documents du gouvernement de l’Alberta. Ce prix d’excellence a été parrainé par la section ARMA d’Edmonton, et devait être présenté annuellement à la « personne méritante apportant des contributions exceptionnelles dans le domaine du traitement de l’information ». La description du prix inclut « informatique et systèmes » comme l’un des trois principaux domaines de gestion d’information et de documents de l’époque. 4

La première conférence canadienne sur la Gestion des documents a accueilli 252 participants à Banff, en Alberta, du 4 au 6 février 1980. La conférence a honoré les « Pionniers canadiens ARMA ». Bien sûr, Bob était parmi eux! Le groupe était constitué de « …présidents fondateurs de section, anciens vice-présidents régionaux, officiels internationaux et membres fondateurs qui…sont restés actifs dans l’ARMA pendant plusieurs années… Des certificats d’appréciation leur ont été décernés au Déjeuner des Pionniers, à la conférence de Banff, le 6 février 1980. » Les pionniers sont nommés et ils représentent sept sections canadiennes de l’époque : Montréal, Ottawa, Toronto, Winnipeg, Calgary, Edmonton et Vancouver. 5

Une citation d’un des collègues de Bob, Jim Coulson, CRM, FAI, insuffle de la vie au personnage, au moment de la première Conférence canadienne: « Bob Morin, de Saskatoon, était un pionnier passionné de la gestion des documents et de l’information qui a réuni des professionnels de gestion de documents des quatre coins du pays pour organiser la première Conférence sur les documents et l’information au Canada. Deux ans plus tard, la conférence a eu lieu à Montréal, avec des présentations et des actes dans les deux langues officielles. La conférence d’ARMA Canada est depuis lors devenue un événement annuel hautement respecté. » 6

 

ARMA – Publication et présentation – emblèmes de la contribution

 Bob a partagé son expérience pratique avec la communauté ARMA grâce aux publications et aux présentations. Ces canaux de communication sont hautement respectés au sein de cette communauté.

Un article que Bob a écrit pour le Records Management Quarterly en 1980, intitulé « Gestion de systèmes d’information – une approche globale », montre comment l’alphabétisation informatique est un thème continu de sa carrière. L’article ne fait pas que nous informer sur les composantes technologiques de la gestion des archives de l’époque, mais souligne également l’importance de l’intégration des technologies et des disciplines: « L’approche met l’accent sur le besoin d’analyse des systèmes cléricaux dans leurs phases de conception. La rentabilité est illustrée par l’utilisation efficace de toutes les technologies de l’information telles que la micrographie, le traitement de texte, la conception de formulaires, le traitement de données et les ressources documentaires. » 7

ARMA a publié un article antérieur de Bob, datant de 1976 et intitulé « Relocating a Records Management Operation. » 8 Dans cet article, la thèse de Bob est que le gestionnaire de documents s’impliquera dans les déménagements de bureaux en raison de « son intérêt pour tous les aspects de la paperasserie ». Ces mouvements vont d’un simple réaménagement à une délocalisation complète. Il fournit un plan, étape par étape, couvrant à la fois les exigences de préparation et de déplacement physique, d’une valeur particulière pour le gestionnaire de documents. Ce sujet continue d’être d’actualité pour les gestionnaires de documents du 21e siècle!

En 1983, Bob a partagé ses connaissances lors d’une présentation à la 28e conférence annuelle, où le thème de la conférence était « L’économie émergente de l’information ». Son sujet était « Ce que le gestionnaire de documents doit savoir sur la documentation des systèmes de traitement de données ». Le synopsis indique que la présentation inclut « …la manipulation correcte des bandes magnétiques…La façon de planifier les enregistrements de traitement de données, tels que les bandes magnétiques, les bases de données et les disquettes, sera discutée. » 9

 

ARMA – Promotion de la formation pour les gestionnaires de documents

Une autre pierre angulaire de l’activité professionnelle de l’ARMA est la prestation de programmes de formation. Bob était aussi un leader dans ce domaine.

De 1975 à 1982, alors qu’il était directeur de la gestion des documents pour le Gouvernement de l’Alberta, « il a dirigé un programme de formation de onze cours portant sur la gestion de documents, les systèmes de documents, la micrographie, la bureautique, la gestion de formulaires et le traitement de données… Il a enseigné la Gestion de documents à la faculté de bibliothéconomie de l’Université de l’Alberta, où il a joué un rôle déterminant dans l’établissement d’un cours crédité sur la gestion des documents. »10

Ses étudiants et ses collègues ont dit de Bob qu’il était acharné dans sa générosité pour aider les jeunes. Il a toujours soulevé le moral collectif de ceux qui l’entouraient.

En hommage, la section de l’ARMA de la Saskatchewan continue de remettre le prix Bob Morin à un jeune professionnel, pour l’aider financièrement dans le domaine de la gestion des documents.

Au regard des critères de l’ARMA pour le statut professionnel – prix, éducation, publication, présentation et sphère d’influence – Bob a atteint les sommets. Pour ceux d’entre nous qui ont servi à titre de bénévoles, on peut se demander: « Où prenait-il son énergie? » Et il ne s’est pas arrêté avec ARMA!

 

IRMF – International Records Management Federation

En 1978, Bob a été choisi pour être l’un des deux délégués d’ARMA International à une organisation professionnelle appelée International Records Management Federation (Fédération internationale de gestion des documents) (IRMF). 11

Au départ, la Fédération a été établie pour servir un intérêt mondial pour la gestion des documents. Le Records Management Quarterly a publié une rubrique intitulée Outlook for the Federation afin de partager ses objectifs, ses motifs de fierté et ses réalisations. « La Gestion de documents n’est plus un sport entièrement américain, bien que plusieurs parmi nous “internationaux” admirent toujours ARMA – fort de plus de 4 800 membres et toujours en augmentation, et quelle équipe de gestion! » 12

En 1982, Bob avait accédé à la présidence de la Fédération.13 Un document de réflexion sur l’avenir de la Fédération a été présenté la même année à ARMA International. En conséquence, une nouvelle constitution a été examinée et approuvée par le Conseil exécutif de l’IRMF, qui a été renommée International Records Management Council (IRMC – Conseil International d’Administration des Archives).14 Une autre « première » pour Bob.

 

ICRM – Institute of Certified Records Managers

Bob a continué à assumer le leadership et à utiliser sa sphère d’influence personnelle avec l’ICRM – Institute of Certified Records Managers (Institut des gestionnaires de documents certifiés). Il a été membre du Conseil d’administration de janvier 1986 à décembre 1988.

 

En conclusion …

Bob est l’un des nombreux pionniers de la gestion des documents au Canada. Leur capacité d’adaptation, leur détermination et leur volonté de réussir créent un remarquable exemple d’excellence dans le domaine de la gestion des documents au Canada.

C’est un défi particulier pour ceux d’entre nous qui restent, ou qui sont nouveaux dans la profession, de faire une pause, de réfléchir sur leur parcours. Nous suivons notre propre parcours de carrière en gestion des documents – forgeant notre chemin, tout en acquérant des compétences de gestion de temps pour établir des priorités et réussir.

C’est le temps de réfléchir? Considérez cela comme des vacances professionnelles. Quelques minutes passées à retourner dans le siècle précédent vous donneront comme récompense l’inspiration et de l’admiration pour ceux qui ont fait face à des circonstances très semblables: l’influence de la société, la politique, les événements mondiaux et, toujours, le progrès de la technologie.

 

Bibliographie

1 “Association News and Events”, ARMA Records Management Quarterly, 12, Numéro 3 (1978), 51-52.
2 “Association News and Events”, ARMA Records Management Quarterly, 12, Numéro 3 (1978), 51-52.
3 “Association News and Events”, ARMA Records Management Quarterly, 17, Numéro 3 (1983), 56.
4 David H. Bell, “Canadian Notes”, ARMA Records Management Quarterly, 13, Numéro 2 (1979), 48.
5 Don H. Bell, “Canadian Notes”, ARMA Records Management Quarterly, 14, Numéro 2 (1980), 52.
6 Jim Coulson, “Some Personal Reflections on the History of RIM in Canada”, 2017.
7 Robert P. Morin, “Management Information Systems – A Total Approach”, ARMA Records Management Quarterly, 14, Numéro 2 (1980), 5-6, 16.
8 “Records Management Quarterly Cumulative Index of Articles (Annotated) janvier 1967 – octobre 1977”, ARMA Records Management Quarterly, 11, Numéro 4 (1977), 63.
9 “28th Annual Conference ARMA The Emerging Information Economy”, ARMA Records Management Quarterly, 17, Numéro 3 (1983), 61.
10 “ICRM Information”, ARMA Records Management Quarterly, 19, Numéro 4 (1985), 62.
11 “Outlook – International Records Management Federation”, ARMA Records Management Quarterly, 12, Numéro 2 (1978), 55.
12 “Outlook – International Records Management Federation”, ARMA Records Management Quarterly, 12, Numéro 2 (1978), 55.
13 “IRMF Reorganizes: Announces New Publications”, ARMA Records Management Quarterly, 16, Numéro 2 (1982).
14 “IRMF Reorganizes: Announces New Publications”, ARMA Records Management Quarterly, 16, Numéro 2 (1982).



CAN/CGSB 72.34-2017 ENREGISTREMENTS ÉLECTRONIQUES UTILISÉS À TITRE DE PREUVES DOCUMENTAIRES

Par Sharon Byrch, Uta Fox, CRM, FAI et Stuart Rennie JD, MLIS, BA (Hons.)

Introduction

En mars 2017, l’Office des normes générales du Canada (ONGC) a publié la nouvelle norme sur les Enregistrements électroniques utilisés à titre de preuves documentaires, CAN/CGSB-72.34-2017. Cette nouvelle norme annule et remplace la version de 2005, CGSB 72.34-2005. Deux des auteurs de cet article, Uta Fox et Stuart Rennie, étaient membres du comité de l’ONGC qui a élaboré la CAN/CGSB-72.34-2017. Ayant co-présenté les deux versions de la norme CAN/CGSB-72.34 dans les conférences nationales d’ARMA Canada et dans les sections d’ARMA Canada à travers le pays, ils sont fréquemment contactés afin de fournir plus d’informations sur cette norme. Cet article répond à ces demandes d’informations de la communauté de gestion des documents (GD) sur la mise à jour de 2017, et sur la façon de l’appliquer.

Afin d’élargir la portée de cet article, ils ont abordé Sharon Byrch, professionnelle de la GD, qui a accepté de collaborer à sa rédaction. Il traite des différences majeures amenées par la norme CAN/CGSB-72.34-2017 comparativement à la version de 2005, du point de vue juridique, de gestion des documents (GD) et des technologies de l’information (TI). Bien que l’accent soit mis principalement sur les changements importants au contenu, il contient certaines recommandations pour la mise en œuvre de la norme dans vos programmes de gestion des documents et de gouvernance de l’information (GI).

Dans cet article, les opinions exprimées par les auteurs sont personnelles et ne sont affiliées à aucun organisme. Les informations fournies sont basées sur leurs expériences et sur les connaissances qu’ils ont acquises, et sont uniquement partagées à titre informationnel, et non dans le but de fournir des conseils juridiques, techniques ou professionnels.

L’objectif de la norme CAN/CGSB-72.34-2017 est de suggérer des principes et des procédures que les organismes peuvent appliquer afin d’améliorer l’admissibilité de leurs enregistrements électroniques à titre de preuve dans les procédures judiciaires. La norme CAN/CGSB-72.34-2017 est une norme volontaire, disponible à l’achat en français et en anglais, et peut être obtenue en format papier ou en version électronique. La norme CAN/CGSB-72.34-2017 ne remplace pas les conseils juridiques et techniques d’experts.

L’ONGC est un organisme du gouvernement fédéral canadien qui aide à l’élaboration de normes dans de nombreuses industries. Actuellement, il y a plus de 300 normes à l’ONGC. Ces normes sont élaborées par un comité de bénévoles, experts dans leurs domaines respectifs. L’ONGC est accrédité par le Conseil canadien des Normes (CCN), une société d’état fédérale, à titre d’organisme d’élaboration de normes. Son mandat est de promouvoir des normalisations efficientes et efficaces au Canada. La norme CAN/CGSB-72.34-2017 est une norme nationale du Canada, comme en témoigne l’inclusion du préfixe «CAN» dans son numéro de référence, ce qui indique qu’elle est reconnue comme norme canadienne officielle dans un domaine particulier. 1

Lors de la lecture de la norme, les utilisateurs doivent être attentifs au langage utilisé par l’ONGC pour que les organismes puissent se conformer à la norme. Prenez le mot « doit », par exemple, souvent utilisé dans CAN/CGSB-72.34-2017. « Doit » implique une exigence obligatoire. Par exemple : « Un organisme doit établir le programme de GD ».2 Ceci est une déclaration d’obligation. Autrement dit, les organismes qui utilisent cette norme doivent avoir en place un programme de GD autorisé. Le mot « devrait » indique une recommandation, tandis que « peut » correspond à une option, ou à ce qui est permis dans les limites de la présente norme.

En plus de démontrer l’intégrité, l’authenticité et la fiabilité des enregistrements électroniques afin de répondre aux exigences en matière de preuve, les organismes doivent s’assurer que les enregistrements électroniques qu’ils créent, reçoivent et conservent dans les systèmes de tenue d’enregistrements électroniques sont conformes aux exigences de la CAN/CGSB-72.34-2017, afin d’assurer leur admissibilité dans les instances judiciaires. Comme indiqué:

Cette norme fournit un cadre et des lignes directrices pour la mise en œuvre et l’exploitation des systèmes de dossiers pour les enregistrements électroniques, que les renseignements qu’ils contiennent soient ou non éventuellement requis à titre de preuve. Ainsi, le respect de cette norme doit être considéré comme une démonstration de gestion responsable des affaires.

L’application de la norme aux procédures d’un organisme n’éliminera pas la possibilité de litiges, mais il est fort probable qu’elle rendra la production d’enregistrements électroniques plus facile et leur acceptation dans une procédure judiciaire plus certaine.3

L’un des changements apportés par la mise à jour CAN/CGSB-72.34-2017 est que l’ONGC a retiré une norme complémentaire : CAN/CGSB-72.11-93, Microfilms et images électroniques à titre de preuve documentaire. Ce retrait est entré en vigueur le 24 janvier 2017.4 Le retrait de la norme CAN/CGSB-72.11-93 s’explique par son utilisation limitée et par l’appui général entourant son retrait. De plus, la section 3 de la partie III et la section 3 de la partie IV, traitant des images électroniques dans la norme CAN/CGSB-72.11-93, sont incorporées dans la norme CAN/CGSB-72.34-2017. Les organismes peuvent toujours utiliser et référencer la CAN/CGSB-72.11-93, mais cela n’est pas recommandé, puisqu’elle n’a plus de poids et n’est plus appuyée par l’ONGC. Par conséquent, les organismes devraient utiliser la version la plus récente de la norme, la CAN/CGSB-72.34-2017.

 

Legal requirements for electronic records as documentary evidence

Section 5 of CAN/CGSB-72.34-2017 sets out the legal requirements that apply to electronic records as documentary evidence.

Like the 2005 version, the new 2017 standard focuses on these legal requirements for legal admissibility:

[The] use of an electronic record as evidence requires proof of the authenticity of the record, which may be established by demonstrating the integrity of the electronic records system in which the record is made, received or stored, and by demonstrating that the record was made "in the usual and ordinary course of business" or is otherwise exempt from the legal rule excluding hearsay.5

The 2017 and 2005 versions are therefore essentially the same in this respect, so organizations using the 2005 standard can amend their policies, procedures and workflows to comply with the new CAN/CGSB-72.34-2017 without having to start over, which is convenient.

CAN/CGSB-72.34-2017 consistently refers to the Canada Evidence Act and does not mention provincial or territorial statutes. Organizations not subject to federal Canadian jurisdiction will need to review the provincial or territorial legislation that applies to them with respect to their electronic records.

Organizations can also draw on other sources of legal information to understand how Canadian courts and tribunals interpret these legal requirements. One useful source is the website of the Canadian Legal Information Institute: https://www.canlii.org/. CanLII, a non-governmental, non-profit organization, provides free online access to court judgments, tribunal decisions, statutes and regulations from every Canadian province and territory through an easy-to-use search interface. The legal information is available in both English and French. CanLII is one of the best sources for lawyers, and for the general public, to consult Canadian law.

Section 5 of CAN/CGSB-72.34-2017 rightly emphasizes that organizations must prove that their electronic records are legally admissible. Two key evidentiary tools organizations can use are: (1) the RM manual, and (2) the IT system management guide. These tools should contain the organization's authorized policies and procedures, serving as documentary proof that the organization meets the requirements for the legal admissibility of its electronic records.

CAN/CGSB-72.34-2017 contains new elements reflecting technological change since 2005:

  • Electronic discovery (electronic evidence);
  • Technology-assisted review (TAR) using specialized search software;
  • Legal holds; and
  • Electronic signatures.

 

Electronic discovery (electronic evidence) and litigation readiness

In section 5.3, for the first time, CAN/CGSB-72.34-2017 refers to electronic discovery in civil litigation. Electronic discovery is a pre-trial procedure in which the parties to a lawsuit are required to exchange relevant electronic records under the supervision of the courts. Because organizations now create and store electronic records in ever greater numbers, their challenge is to be able to search, retrieve and produce the relevant electronic records required by Canadian courts so that those records can be admitted as evidence in legal proceedings.

In 2008, to help organizations produce admissible electronic records in Canada, the Sedona Canada Principles Addressing Electronic Discovery (the Sedona Canada Principles) were developed. Sedona Canada is a volunteer, non-governmental organization of lawyers, judges and technologists from across Canada. Like the CGSB standard, the Sedona Canada Principles are largely voluntary, except in Ontario: since 2010, the courts of Ontario have been required to apply them under the province's Rules of Civil Procedure. Outside Ontario, the Sedona Canada Principles are accepted by courts and tribunals across the country to help parties produce and review relevant electronic records in a timely, efficient and cost-effective manner. Canadian case law on the Sedona Canada Principles is still developing.

In 2015, much as the CGSB did in 2017 with CAN/CGSB-72.34-2005, Sedona Canada published an updated second edition of the Sedona Canada Principles to reflect technological and legal change.6

With respect to disclosure in criminal proceedings, CAN/CGSB-72.34-2017 refers, for the first time, to R. v. Oler, 2014 ABPC 130 (CanLII).7 Oler is a landmark case for several reasons. First, it is the first case to consider and apply the 2005 version of CAN/CGSB-72.34. Second, it is the first case in Canada to hold that an organization's digitization process is admissible as electronic evidence because it complies with the Canada Evidence Act and, through its use of industry standards, with the Alberta Evidence Act. Third, the court in Oler was the first in Canada to accept expert evidence in records management. The expert in Oler was Uta Fox, co-author of this article. Uta Fox is the first records manager in Canada to be accepted as a qualified legal expert witness; in that capacity, she assisted the court in determining the legal admissibility of electronic records.

There is little case law in Canada on the relationship between RM and the law. More is needed to advance the electronic records and information management profession and the admissibility of records, particularly electronic records, in Canadian courts. This becomes all the more critical as organizations go increasingly "digital" and create and manage only electronic records rather than paper documents. Given the inclination of Canadian courts to refer to approved National Standards of Canada such as CAN/CGSB-72.34-2017, Oler is a step in the right direction for electronic records and information management.

There is a growing need for RM professionals to be accepted by the courts as experts, as Uta Fox was, to help determine the admissibility of electronic evidence. Canadian courts do not qualify people as experts lightly: they require proof that a person has the qualifications, training, specialized knowledge and expertise needed to be qualified as an expert, and the person must be independent and impartial in order to assist the court in that role.8 Organizations that, like the Calgary Police Service in Oler, operate electronic document and records management systems (EDRMS) run and manage complex RM and IT systems with legal implications beyond the experience of the public or the casual user. As more organizations move to EDRMS, more court-qualified RM experts will be needed to advise on these complex systems so that the admissibility of their electronic records can be determined in civil and criminal matters. Oler is the first case in Canada to admit electronic records digitized with an EDRMS in place of paper originals. RM case law must develop to keep pace with the business use of EDRMS.

From a legal standpoint, these are welcome additions to the standard. Organizations can look to the Sedona Canada Principles and the Oler case, alongside CAN/CGSB-72.34-2017, as credible sources of best-practice information to further improve their legal compliance and reduce their legal risk.

 

Technology-assisted review (TAR) and other automated tools and techniques

The 2005 standard is silent on technology-assisted review, which is addressed for the first time in section 5.3.1 of CAN/CGSB-72.34-2017. Technology-assisted review uses technology to automate the process of identifying relevant electronic records. As with electronic evidence, CAN/CGSB-72.34-2017 refers to the Sedona Canada Principles, and rightly so. Organizations increasingly use technology-assisted review, and Canadian case law refers to it, making this another welcome addition.

 

Legal holds

Section 5.4 of CAN/CGSB-72.34-2017 refers to the "legal hold", which was absent from the 2005 standard. Increasingly applied to records of all kinds, a legal hold is a process by which an organization preserves every form of potentially relevant record when litigation is reasonably anticipated or under way. The risk an organization runs by failing to preserve electronic evidence under a legal hold is that relevant evidence may be inadvertently destroyed and therefore unavailable to the courts for determining liability or exoneration. When an organization inadvertently destroys important evidence, it increases its exposure to legal sanctions from the courts, which take a dim view of the destruction of evidence because it can serve to subvert the rule of law.

CAN/CGSB-72.34-2017 refers to the Sedona Canada Principles with respect to legal holds. Together, CAN/CGSB-72.34-2017 and the Sedona Canada Principles give organizations a clear understanding of what to preserve, how to apply a hold, and how to avoid the risk of inadvertently destroying or spoliating relevant electronic evidence. Both documents stress the need for expert legal advice as soon as possible once a legal hold becomes necessary.
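The legal-hold behaviour described above, where a hold suspends any disposition of a potentially relevant record until the hold is released, can be sketched as a minimal data model. This is an illustrative sketch only; the class and field names are ours, not taken from the standard or from any particular EDRMS.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A records-system entry that can be placed under one or more legal holds."""
    record_id: str
    series: str
    legal_holds: set = field(default_factory=set)

    def apply_hold(self, hold_id: str) -> None:
        """Flag the record as subject to a legal hold (e.g. a litigation matter)."""
        self.legal_holds.add(hold_id)

    def release_hold(self, hold_id: str) -> None:
        """Release one hold; the record stays blocked while any other hold remains."""
        self.legal_holds.discard(hold_id)

    def may_be_destroyed(self) -> bool:
        """Disposition is blocked while any hold is active."""
        return not self.legal_holds

# Usage: a hold blocks destruction until it is released.
r = Record("2024-0042", "Contracts")
r.apply_hold("LIT-2024-07")
assert not r.may_be_destroyed()
r.release_hold("LIT-2024-07")
assert r.may_be_destroyed()
```

In a real records system the hold would also be logged and reported, but the core rule is the one shown: no destruction while any hold is in force.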

 

Signatures

As for signatures, the 2005 standard has no specific section on electronic and paper signatures; it mentions them only in passing.

CAN/CGSB-72.34-2017, by contrast, devotes section 5.5 to electronic and handwritten signatures. A handwritten signature is generally made in ink, with a paper seal or even traditional wax. The distinction between electronic and handwritten signatures is useful: many organizations now use both types at once, and there is often confusion and uncertainty within organizations about the legal admissibility of each. CAN/CGSB-72.34-2017 provides baseline requirements for their use and specifies when to seek expert advice to ensure their legal admissibility.

 

Authenticated paper copies for legal proceedings ("certified copies")

For "certified copies", section 5.6 of the 2005 standard required that paper copies be authenticated by signature as a "certified copy" of the original to improve their admissibility and weight before the courts, and that this authentication be documented with procedures.

In the 2017 version, section 5.6 reads essentially as it did in 2005. The main difference is that the 2005 standard does not mention "electronic records"; they are implied. CAN/CGSB-72.34-2017, for its part, uses the term "electronic records" to make clear that copies of electronic records must be authenticated by signature as "certified copies" of those electronic records.

CAN/CGSB-72.34-2017 also adds that an affidavit may be used as the record of authentication; the 2005 standard does not mention affidavits. The new reference to affidavits is helpful, since the affidavit is a standard evidentiary tool in the litigation process.

 

Records Management (RM) Program

Both versions of CAN/CGSB-72.34 contain provisions on establishing an RM program. The 2005 section, titled "Establishing an RM system", was retitled "Records management program" in 2017.

Before discussing the RM changes in CAN/CGSB-72.34-2017, let us review what carries over from the 2005 standard, since these are essential elements of any RM program. Sections 6.1 to 6.4.1 of CAN/CGSB-72.34-2017 address the fundamentals an organization must have in place: the RM concepts, principles, methods and practices the organization adopts must demonstrate that a proper RM program exists and is an integral part of its usual and ordinary course of business.9

Organizations must therefore have:

  • An authorized RM program;
  • An authorized RM policy;
  • An authorized RM manual; and
  • A records manager.10

Section 6.1 states that the RM program itself "shall support a records system made up of appropriate records procedures and controls that complement the organization's business procedures."11 An organization shall:

  • Establish the RM program;
  • Develop an RM policy, with definitions and assigned responsibilities;
  • Design RM procedures and their supporting documentation;
  • Select and implement technologies that support the records system;
  • Establish records protection measures, including audit trails and backups; and
  • Establish a records quality assurance process.12

Note that these are "shall" statements, meaning the organization must support its records systems with appropriate records procedures and controls aligned with its business operations.

Organizations must establish an RM program, have an RM policy, use definitions, assign responsibilities, develop procedures, select technologies, implement records protection processes and adopt a quality assurance program. These are non-negotiable elements that must be in place before an organization can even begin to comply with the standard.

 

Accountability – the records manager

Both the 2005 and 2017 versions of CAN/CGSB-72.34 require organizations to appoint a records manager, who is responsible for implementing the RM program and for ensuring it is integrated into the organization's regular operations. The records manager is also responsible for certifying that the organization implements its RM policy, program and manual. The records manager's role and responsibilities must be defined in a policy, bylaw or directive, because the role is essential to establishing and maintaining the RM program and to overseeing the organization's compliance with the laws, standards and policies that affect it.

CAN/CGSB-72.34-2017 gives the records manager direction by recognizing that the organization's policy assigns him or her responsibility for maintaining and amending "…the RM manual, with the support of IT staff, so that it reflects the exact state of the records and can serve as proof of the system's compliance with the law and this standard."13 The responsibilities of employees with RM duties must also be defined for compliance-audit purposes. This direction applies to every employee in an organization who has RM duties, not just staff in the RM department. To comply with CAN/CGSB-72.34-2017, make sure your organization defines and fulfils RM requirements and duties for all of your employees.

 

RM Policy

Both CGSB versions emphasize that senior management must authorize an RM policy stating that the management of electronic records is an integral part of the organization's usual and ordinary course of business. The policy must also identify and address:

  • The records, records systems and exclusions covered (scope);
  • The relevant RM and information technology standards used in the RM program;
  • The records manager's responsibility for the records system;
  • The records system's compliance with the RM manual, the law, and national, international and industry standards;
  • The records manager's responsibility for maintaining and updating the RM manual;
  • Requirements for creating, managing, using and disposing of records;
  • The IT department's work with the records manager to integrate RM into the usual course of business; and
  • The quality assurance processes assigned to the records manager.14

 

RM Manual

The RM manual is the tool that brings together all records-related procedures to ensure completeness and consistency of practice15, and it is included in both versions of CAN/CGSB-72.34. The required components of an RM manual, such as procedures for making, receiving, storing and deleting records, are also consistent across the two versions. CAN/CGSB-72.34-2017 emphasizes that the RM manual:

shall be kept up to date and accurately reflect the exact nature, functions, procedures and processes of the organization's electronic system, that is, how that system participates in and supports the usual and ordinary course of business;16

CAN/CGSB-72.34-2017 states that the manual must specify how the electronic systems operate and are used, and must include references to other relevant documents such as the IT system management guide, operating procedures or IT system documentation.17 This is another instance of the standard requiring IT and RM to collaborate to meet its requirements.

Although RM expresses its requirements differently than IT does, the two fields clearly share common interests and the same end goal: better information governance of electronic records. Collaboration between RM and IT is strategically sound, because supporting one field advances the goals and objectives of the other, and ultimately both benefit. IT and RM are essentially different expressions of the same objective: their stakeholders overlap, and they share the same risks and consequences, resources and priorities.

The future of RM depends on its successful implementation and integration within the organization. The working relationship between RM and IT is decisive to that success, and that success depends on both parties committing to this strategic approach. The sooner this cooperation is established, with shared resources, communication and mutual support to advance the work, the better.

Keeping the manual current, through a formal review process at predetermined intervals, ensures that the RM manual continues to reflect the organization's RM activities for electronic records in real time.18

Your policy, program, manual and records manager together form the foundation of your RM operations. Organizations are not compliant with CAN/CGSB-72.34-2017 if they lack a policy, a program, a manual or a records manager and are then called upon to produce their electronic records for legal purposes. As CAN/CGSB-72.34-2017 states, compliance with the standard "should be considered a demonstration of responsible business management".19

 

Digitization

As noted on page 3, CAN/CGSB-72.34-2017 incorporates the core digitization requirements of Microfilm and Electronic Images as Documentary Evidence (CAN/CGSB-72.11-93). Organizations implementing an imaging program within their RM programs can use the checklist of CAN/CGSB-72.34-2017 requirements20 set out in Table I below. Organizations with an existing imaging program can also use the checklist as an audit tool to demonstrate their compliance with CAN/CGSB-72.34-2017.

The checklist sets out the "shall" requirements taken directly from the standard (see the "CAN/CGSB-72.34-2017 requirement" column below). It also identifies, in the "Action required" column, the requirements to be addressed and implemented, while the "Responsibility" column records the positions responsible for completing each action. The checklist contains suggestions only, and it is strongly recommended that you develop some form of tracking system to document implementation status and progress.
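A tracking system of the kind recommended above can start out very simply, as a table of checklist rows with a status field that can be reported on and exported. The sketch below is illustrative only; the row wording is paraphrased, not quoted from the standard, and the role names are hypothetical.

```python
import csv
import io

# Illustrative checklist rows mirroring the three columns described in the
# article (requirement, action required, responsibility), plus a status field.
checklist = [
    {"requirement": "Digitization procedures are documented in the RM manual",
     "action_required": "Draft and authorize scanning procedures",
     "responsibility": "Records Manager",
     "status": "in progress"},
    {"requirement": "Image quality control is performed and logged",
     "action_required": "Define QC sampling rate and log template",
     "responsibility": "Imaging Technician",
     "status": "not started"},
]

def outstanding(rows):
    """Return the requirements not yet completed, for status reporting."""
    return [r["requirement"] for r in rows if r["status"] != "complete"]

# Export to CSV so implementation progress can be tracked in a spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=checklist[0].keys())
writer.writeheader()
writer.writerows(checklist)
```

Calling `outstanding(checklist)` on the sample data returns both requirements, since neither is marked complete; the CSV export keeps the documentation of progress alongside the checklist itself.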

 

Table I

Digitization checklist – 6.4.4.4 Digitization


 

Records retention requirements

A number of changes were made to the records retention section in CAN/CGSB-72.34-2017. The 2005 version dealt with consumer, goods and services requirements when assigning retention periods, whereas the 2017 version takes a general business approach. CAN/CGSB-72.34-2017 states that, to assign appropriate retention periods, the authorized individuals responsible for the organizational functions the records support (for example legal counsel, finance, HR) must be included in appraisal and retention decisions. Ideally, organizations should establish a records management committee of subject-matter, legal, financial, human resources and other experts responsible for these decisions. The authorizing committee's decisions assigning retention periods to records series must be documented in the records retention schedule and linked to the classification system.

The standard provides several critical appraisal factors that can ultimately define retention requirements, including:

  • How the organization uses the records, internally and externally;
  • Users' need for access in the event of a disaster;
  • The financial, legal, social, political and historical value of the records;
  • A cost/benefit analysis of retention;
  • The impact on the organization if the records are destroyed; and
  • The evidentiary value of the records if needed for litigation, an audit or an investigation.22

Other changes in the 2017 standard concern the organization's records system and its ability to handle fixed-date, event-based and permanent retention. The records manager must review all records dispositions before any destruction decision is made, to ensure that records scheduled for disposal are not subject to a legal hold or to an organizational or government control.23 If they are, the records must be withheld from the destruction process, as described in section 5.4, Legal holds, of CAN/CGSB-72.34-2017.
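The three retention triggers just mentioned (fixed-date, event-based and permanent) can be sketched as a single eligibility check. This is a minimal sketch under our own assumptions, not the standard's wording: the trigger labels and the rule that an event-based clock cannot start until the event date is known are illustrative conventions.

```python
from datetime import date
from typing import Optional

def disposition_due(trigger: str, retention_years: Optional[int],
                    closed: Optional[date], today: date) -> bool:
    """Return True when a record series' retention period has elapsed.

    trigger: "fixed" (retain N years from the closure date), "event"
    (the retention clock starts only once the triggering event date is
    known), or "permanent" (never eligible for destruction).
    """
    if trigger == "permanent":
        return False                 # permanent records are never destroyed
    if closed is None:
        return False                 # event has not occurred; clock not started
    due = date(closed.year + retention_years, closed.month, closed.day)
    return today >= due

# Usage: a 7-year series closed in 2015 is due by 2024; an event-based
# series with no event date yet, and a permanent series, are not.
today = date(2024, 1, 1)
assert disposition_due("fixed", 7, date(2015, 6, 30), today)
assert not disposition_due("event", 7, None, today)
assert not disposition_due("permanent", None, date(2000, 1, 1), today)
```

In practice this check would run before, not instead of, the records manager's review: a record that is "due" must still be cleared of legal holds and other controls before destruction proceeds.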

 

Records disposition

There are significant changes to the disposition section in CAN/CGSB-72.34-2017. The 2005 version recognized disposition as including the destruction of records and their transfer to another entity, but it did not discuss transferring records to another organization or preserving records.

CAN/CGSB-72.34-2017 remedies this by providing substantial detail on the disposition process, the destruction of electronic records, the transfer of records to another entity, and preservation practices for records retained permanently or long term.

 

Destruction of electronic records

As noted, both versions of the standard recognize that records must first meet their retention requirements and be authorized for disposal/destruction before any disposition action begins. CAN/CGSB-72.34-2017 specifies that the RM manual must state whether the records system permits records to be destroyed, altered or corrected, by means of a modifiable process. For records that have been destroyed, the system's procedures must ensure that both the record and its locator are destroyed. The standard also states that records destruction must be carried out in a way that does not compromise confidentiality or the protection of personal information.24
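The requirement that both the record and its locator be destroyed can be sketched as follows. The sketch is illustrative, not any vendor's API: the in-memory store, index and log stand in for a repository, its finding aid, and the destruction documentation an organization would actually keep.

```python
# Minimal sketch: destroying a record also removes its locator (the index
# entry pointing to it) and documents the action in a destruction log.
records = {"2016-0108": b"...scanned content..."}       # record store
index = {"2016-0108": "/repository/2016/0108.pdf"}      # locators
destruction_log = []                                     # documentation

def destroy(record_id: str) -> None:
    """Destroy the record AND its locator, and document the destruction."""
    del records[record_id]
    del index[record_id]   # the locator must not survive the record
    destruction_log.append({"record_id": record_id, "action": "destroyed"})

destroy("2016-0108")
assert "2016-0108" not in records
assert "2016-0108" not in index
assert destruction_log[0]["record_id"] == "2016-0108"
```

The point of the log entry is the documentation requirement: the organization can show that destruction happened under its authorized procedures, without retaining the content itself.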

 

Transfer of electronic records to another entity

Section 6.4.6.4 of CAN/CGSB-72.34-2017 states that an organization's RM manual must document all records transferred to and accepted by an archives (or another entity), and that both the transferring and the receiving organization must maintain this documentation. The receiving organization may require additional information, such as the identity of the hardware and software that generated the records, and program documentation describing the format, file codes, file layout and other technical details of the records system in which the records resided.25

 

Records preservation

Section 6.4.6.5 of CAN/CGSB-72.34-2017 recognizes that organizations must ensure their RM manuals emphasize that records preservation begins with creating and keeping records in preservable file formats that capture the required identity and recordkeeping metadata. Throughout the records' life cycle, organizations must demonstrate that their electronic records were created, received and stored in the usual and ordinary course of business, and that they continue to meet admissibility requirements by ensuring their authenticity.

Organizations required to retain records permanently must ensure that their recordkeeping systems are capable of keeping records permanently, a key consideration when selecting records systems, as is protecting records against software obsolescence.26

 

Records conversion and migration

Section 6.4.6.5.2 of CAN/CGSB-72.34-2017 describes records conversion as changing recorded information from one format to another, while migration is moving recorded information from one IT system to another (both are ways of avoiding software obsolescence, and both carry risk). The standard notes that there are two types of obsolescence:

  1. File format obsolescence, where software can no longer open or display the record's content. File format obsolescence can be avoided by converting records, that is, moving a record from one format to another;
  2. System obsolescence, where a system or application is no longer supported and the records cannot be retrieved. System obsolescence can be remedied by migrating the electronic records to a new system.27

Organizations must have a conversion and migration policy. That policy must be documented in your RM manual and include the procedures adopted to protect and preserve the structure, content, identity and preservation metadata of records. This protection and preservation must extend to all electronic records, including email messages, their attachments, links, proof of delivery, distribution lists and links to other organizational records.

However, before any conversion or migration activity begins, the standard requires the organization to:

  • identify the required functionality of the old format;
  • identify the functionality that must be maintained in the new format and the new system; and
  • document all decisions.

Above all, both migration and conversion must be embedded in a documented business process that is part of the electronic system's normal operation.28
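A conversion or migration decision record capturing those three items can be as simple as a structured document kept with the records. The sketch below is a hypothetical example: the formats, field names and decision shown are ours, chosen for illustration, not prescribed by the standard.

```python
import json

# Hypothetical decision record for one conversion, covering the three items
# the standard requires before conversion or migration begins.
decision = {
    "source_format": "WordPerfect 5.1",
    "target_format": "PDF/A-2b",
    "required_functionality_of_old_format": ["page layout", "embedded dates"],
    "functionality_to_maintain": ["text content", "page layout", "metadata"],
    "decisions": [
        {"issue": "macros cannot be carried over",
         "resolution": "export macro text to an attached note"},
    ],
    "approved_by": "Records Manager",
}

# Serializing the record keeps the documentation with the converted records
# and makes it reviewable later, e.g. during an audit or legal proceeding.
serialized = json.dumps(decision, indent=2)
assert json.loads(serialized)["target_format"] == "PDF/A-2b"
```

Storing one such record per conversion run is one way to satisfy the "document all decisions" requirement as part of a repeatable, documented business process.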

 

Preservation formats

Section 6.4.6.5.2 of CAN/CGSB-72.34-2017 advises organizations to identify preferred preservation formats by record type; in other words, which types of records will be preserved and in what format. Decisions about preservation formats should be based on how much change a record can undergo before it becomes too degraded to serve as a reliable copy in a legal proceeding, and those decisions must be documented in the RM manual. Annex C, on preservation, provides valuable information on preservation formats.29

 

Quality assurance

Both the 2005 and 2017 versions require organizations to build quality assurance measures and processes into their RM programs.

Quality assurance means that the appropriate level of service is defined, that staff know and understand that level of service, and that staff have received the training needed to deliver it.

Implementing the quality assurance processes falls to the records manager and must include performance evaluation, compliance monitoring, self-assessments, external audits, incident handling, and documenting and certifying that all RM obligations are met. Significant problems must be reported to senior management so that changes can be made to the RM program where necessary.30

 

IT elements

Compte tenu de la tendance actuelle à la numérisation, les organismes adoptent des technologies plus rapidement que leur permet leur capacité à incorporer des protocoles appropriés dans leurs systèmes, planification technologique et gestion. Cela conduit à un chaos d’information et à des risques de non-conformité accrus, surtout lorsque les documents s’accumulent et se distribuent dans les différents systèmes, espaces, services, dispositifs et formats, et sous la garde des employés. Forbes et d’autres spécialistes du secteur estiment actuellement que l’information électronique double chaque année.31 Entre temps, sur le volume total de documents (papier et électronique), de nombreux utilisateurs des systèmes de GD estiment qu’approximativement 5-10% nécessiteront une conservation permanente. Ils conviennent, d’après l’expérience de Sharon Byrch, qu’approximativement 60% à 80% de l’information électronique d’un organisme consiste en des documents indésirables  documents transitoires avec une valeur à court terme seulement, non durable  tels que duplicatas, copies à titre informatif, versions obsolètes et travaux incomplets ou abandonnés; ou des documents qui sont expirés et ne sont plus nécessaires. Ces documents pourraient être légalement détruits, et de façon routinière, s’ils étaient facilement identifiables parmi les documents de l’entreprise qui sont encore nécessaires.
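One practical way to surface part of that 60% to 80% (specifically, the exact duplicates and convenience copies) is to compare files by content hash. The sketch below is a hypothetical illustration, not a method prescribed by the standard; any real clean-up must first respect retention schedules and legal holds.

```python
# Hypothetical sketch: group byte-for-byte identical files by content
# hash, one way to surface duplicate "convenience copies". Illustrative
# only; real disposition requires retention and legal-hold checks first.

import hashlib
from collections import defaultdict

def find_duplicates(files: dict[str, bytes]) -> list[list[str]]:
    """Return groups of file paths whose contents are identical."""
    by_hash = defaultdict(list)
    for path, content in files.items():
        by_hash[hashlib.sha256(content).hexdigest()].append(path)
    # Only groups with more than one member are duplicate candidates.
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Hashing flags only exact copies; near-duplicates (obsolete versions, drafts) still require appraisal by RM staff.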

Les professionnels de l’informatique savent que le fouillis d’informations est coûteux. Les besoins en stockage de données et les coûts associés ne cessent d’augmenter à mesure que les entreprises continuent d’étendre leur empreinte numérique. De plus, trier l’information pertinente dans le fouillis est coûteux, en temps et en argent, pour les TI et la GD, et a un impact sur les employés; l’encombrement fait qu’il est difficile pour les employés d’identifier les documents transitoires parmi les documents officiels des entreprises lorsqu’ils recherchent des informations fiables et qui font autorité pour faire leur travail. C’est une situation désavantageuse pour toutes les parties concernées, et un facteur de motivation pour le changement et la mise en œuvre de la norme CAN/CGSB-72.34-2017, afin d’avoir une meilleure gestion des dossiers électroniques, une conformité accrue à la loi et une réduction des risques.

Au même moment où les entreprises se tournent vers le numérique pour gérer leur entreprise, les documents sur papier (analogiques) deviennent de plus en plus des documents d’archives. Par exemple, Sharon Byrch a constaté que, dans un organisme pour laquelle elle travaillait, il y avait un ratio d’environ 1:1000 de documents papier à numériques, ce qui signifie que pour chaque document papier créé, l’organisme a créé 1000 enregistrements électroniques. Ce ratio est une progression naturelle, en raison du déclin de la création de documents papier, et est courant dans plusieurs organismes.

En toute honnêteté, si les organismes ne s’assurent pas que leurs documents sont saisis et rendus facilement accessibles pour usage électronique, ou géré adéquatement en tant que documents (y compris à des fins juridiques, de confidentialité et de sécurité), et conséquemment ne satisfont pas aux critères d’admissibilité pour des procédures judiciairesces organismes gaspillent des ressources inestimables, leur temps et leur argent. Étant donné le taux actuel d’augmentation et de diffusion de l’information que connaissent les organismes de toutes les industries, il est réaliste de penser que les organismes ne puissent plus rattraper leur retard. Pour surmonter cette situation, les organismes doivent repenser leur approche sur la gestion des enregistrements électroniques.

 

IT System Management Guide

An important change for IT is that CAN/CGSB-72.34-2017 has updated all references to the "system management guide", IT's key deliverable and responsibility, to the "IT system management guide". The first instance appears in section 6.5.1, IT system management guide. Section 6.5.1 sets out IT's overarching directive under CAN/CGSB-72.34-2017:

All significant details of the logical and physical architecture of the computer system keeping the records shall be fully documented in the IT system management guide, including the responsibilities of and relationships between the management of the computer system, the RM program and the conduct of business. The IT system management guide shall be structured such that the integrity of the system can be demonstrated at any point in time.32

Complying with this IT directive is essential to successfully making the legal argument for the admissibility of records as documentary evidence in legal proceedings. It is the raison d'être of the standard from an IT perspective, and ties directly to paragraph 31.2(1)(a) of the Canada Evidence Act, so that:

In principle, the best evidence rule in respect of an electronic record is satisfied

  • on proof of the integrity of the electronic records system by or in which the electronic record was recorded or stored.33

The key point for IT is that IT system management documentation is how the integrity of the computer system managing electronic records is demonstrated, and thus how those records are made to meet the requirements for admissibility as evidence.

The critical point is that the integrity of electronic records depends entirely on the integrity of the system(s) that manage them; this is best described and documented by IT staff, who are the experts in those systems and primarily responsible for their technical management. This is the fundamental way IT supports the legal records management requirements for managing digital records to ensure their future admissibility in legal proceedings.

It is important to note that IT is required to keep its IT system management guide up to date as evidence of compliance, just as RM must do with its RM manual. Practically speaking, these documents are constantly under revision, so organizations are advised to retain prior versions as evidence of their past compliance.

Although the requirement to have a system management guide has not changed since the 2005 standard, CAN/CGSB-72.34-2017 offers a better division between IT and records management. This is especially true since the IT system management guide topics are now separate from the RM procedures manual topics, found in section 6.4, Manual. This makes them less confusing for the user. IT and RM professionals will find it worthwhile to review the division of responsibilities between these sections.

 

Audit Trails

In section 6.5.5 of CAN/CGSB-72.34-2017, while the technical requirements for audit trails are described as an IT responsibility, they are a priority for the RM and legal departments. RM and legal are therefore key stakeholders for IT: they need the evidential requirements to be captured and demonstrated by the audit trails. It is wise for IT to include both of these stakeholders in the IT management process.

Here is some advice for IT on working with RM on audit trails. RM is particularly well placed to validate that a system's audit trail captures and records what is needed for evidence, since RM works closely with the business owners of the records to establish which records require capture and management as evidence of their business activities. Accordingly, IT staff should be familiar with the technologies involved. In addition, RM determines the lifecycle of records and their management requirements through a detailed appraisal process, and will know the records' purpose(s), use(s) and legal requirements, their users, the format(s) required, how long the records must be retained, and how to dispose of them once they are no longer needed for business.

The RM department's deep knowledge of the organization's records holdings, their users and their management, including the technologies involved, makes it a considerable asset to IT. This is especially important when IT is considering, testing and implementing system changes, new releases, upgrades or new technologies that could affect audit trails in ways that are unusual, unforeseen or undetectable by IT. As a best practice, IT should involve RM early in the process and allow it time to identify and test for risks and issues that may require mitigation, and documentation for future reference.
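To make the discussion concrete, the sketch below shows the general shape of an append-only audit trail: events are recorded and queried, never altered. The class and field names are assumptions for illustration; the actual fields a trail must capture come from section 6.5.5 and from consultation with RM and legal.

```python
# Hypothetical sketch of an append-only audit trail. Field names are
# illustrative assumptions; actual requirements come from s. 6.5.5 and
# the organization's own IT system management guide.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: an event cannot be modified once created
class AuditEvent:
    record_id: str   # which record was touched
    action: str      # e.g. "capture", "view", "convert", "dispose"
    actor: str       # authenticated user or system process
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditTrail:
    """Append-only log: events can be added and read, never altered."""

    def __init__(self) -> None:
        self._events: list[AuditEvent] = []

    def append(self, event: AuditEvent) -> None:
        self._events.append(event)

    def events_for(self, record_id: str) -> list[AuditEvent]:
        """Reconstruct the history of a single record, in order."""
        return [e for e in self._events if e.record_id == record_id]
```

The append-only discipline is what lets the trail later support a demonstration of system integrity: a complete, unalterable history of who did what to each record, and when.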

 

New Technologies

Section 7 of CAN/CGSB-72.34-2017 introduces, for the first time, sections on:

  • 7.2 Cloud computing;
  • 7.3 Social media; and
  • 7.4 Mobile devices, including bring your own device (BYOD) and bring your own cloud (BYOC).

In CAN/CGSB-72.34-2017, each of these technologies also has its own annex, placed before the bibliography. Most importantly, however, before addressing these technology topics, section 7.1 establishes the overarching requirement to conduct a risk assessment before adopting any new technology.

 

Risk Assessment

In section 7.1 of CAN/CGSB-72.34-2017, as one might expect, a multidisciplinary approach to risk assessment is recommended. This means creating a team of key stakeholders representing RM, legal, security, privacy, IT and risk management. It is important for organizations to determine whether their service providers could, or *should*, be among these stakeholders.

Depending on legal requirements, privacy and security policies and the risk management process, organizations may need to involve service providers in conducting risk assessments. This is especially the case for public bodies, or organizations that provide services to public bodies, which may be required, under the freedom of information legislation applicable in their jurisdiction, to treat service providers, and their agents and/or subcontractors, as public bodies. In such cases, the public bodies are ultimately accountable for the activities of their service providers. This includes: access to information, disclosure, storage management and jurisdictional location, security, backup/disaster recovery and privacy breach management. Usually, in cases like this, a formal privacy impact assessment (PIA) process is provided for by the freedom of information legislation, and tools exist to document the risk assessment. That process can be leveraged for the purposes of CAN/CGSB-72.34-2017.

Using a multidisciplinary approach to risk assessment is necessary to: 1) fully examine the benefits and risks of implementing new technologies and 2) build a solid business case for implementing or abandoning them. The end product of a risk assessment process is a valuable information tool and asset that serves several purposes:

  • informing the communications department so that senior executives/decision-makers can be advised of the risks, threats and benefits;
  • guiding the development of new risk mitigation and management policies and procedures, as needed;
  • establishing a reusable process and benchmarking the new technology for future development and proposals; and
  • serving as the traceability documentation needed to demonstrate the considerations, decisions, activities and follow-up activities related to the risk assessment process and to the implementation or abandonment of the technology.

Organizations are advised to develop a template they can reuse for other risk assessment projects. In this way, organizations can develop a consistent approach and consistent documentation. They can also evaluate and update their templates as needed to maintain compliance. At the same time, organizations should retain previous versions for chronological reference.
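A reusable template of the kind recommended above can be as simple as a structured record with fixed fields. The sketch below is a hypothetical illustration: the stakeholder roles echo those named in section 7.1, but the field names and structure are assumptions, not a form prescribed by the standard.

```python
# Hypothetical sketch of a reusable risk assessment template, so that
# each technology project produces uniform, comparable documentation.
# Stakeholder roles mirror those named in s. 7.1; all other names and
# fields are illustrative assumptions.

from dataclasses import dataclass, field

STAKEHOLDERS = ("RM", "Legal", "Security", "Privacy", "IT", "Risk Management")

@dataclass
class Risk:
    description: str
    likelihood: str   # e.g. "low" / "medium" / "high"
    impact: str
    mitigation: str

@dataclass
class RiskAssessment:
    technology: str
    version: int = 1  # prior versions retained for chronological reference
    participants: list[str] = field(default_factory=list)
    risks: list[Risk] = field(default_factory=list)

    def missing_stakeholders(self) -> list[str]:
        """Flag key stakeholder groups not yet represented on the team."""
        return [s for s in STAKEHOLDERS if s not in self.participants]
```

Checking for missing stakeholder groups before the assessment begins supports the multidisciplinary approach the standard recommends.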

Although not directly mentioned in CAN/CGSB-72.34-2017, organizations can use two International Organization for Standardization (ISO) standards to further strengthen their information systems. The first is ISO/TR 18128:2014, Information and documentation – Risk assessment for records processes and systems. ISO/TR 18128:2014 can be used by organizations to assess the risks to records and records systems so that identified business needs can be met. The second is ISO 15489-1:2016, Information and documentation – Records management – Part 1: Concepts and principles. ISO 15489-1:2016 sets out the concepts and principles organizations can use for the creation, capture and management of records, including electronic records.

A final tip on service providers, whether or not your organization is legally required to involve them in your risk assessment processes: in practice, if service providers are involved in capturing, storing, maintaining or sharing an organization's records, they have a role to play in the risk assessment and, as a best practice, should be included. Organizations will need to understand, assess and document their system architecture and management protocols. These protocols include: security management, data storage management, problem resolution and technical management. The organization must then use these protocols to: (1) detect risks or threats that require mitigation and communication, (2) establish benchmarks for future development and (3) create the traceability documentation required for the risk assessment process.

Another important aspect of risk assessment, although not explicitly mentioned in CAN/CGSB-72.34-2017, is that organizations are advised to conduct their risk assessments early in the process of considering a new technology. You should also be sure to involve your key stakeholders, especially IT, records management, security, privacy and legal. Depending on your situation, you may need to involve external stakeholders. For example, in British Columbia, New Brunswick and Newfoundland and Labrador, public bodies are required to submit their privacy impact assessments to the government and/or their respective information and privacy commissioner.34 External review can have significant impacts on project timelines, scope and delivery, so organizations should build the risk assessment into the requirements and schedule of every technology project from the earliest phases. These phases apply to new technologies and to changes to an existing technology, which may include upgrades, add-ons and new modules, or integration with other systems. Remember that changes to an existing technology can be just as risky as implementing new technology, depending on the scope of the change involved.

 

Cloud Computing

Section 7.2 of CAN/CGSB-72.34-2017 describes cloud computing in general terms, and Annex D provides more detailed information. CAN/CGSB-72.34-2017 recommends that, if an organization uses cloud computing, this be documented in the IT system management guide and the RM manual.

 

Social Media

Section 7.3 of CAN/CGSB-72.34-2017 describes social media in general terms, covering: identification, authorship/ownership, context, reliability, accuracy and authenticity, and chain of custody. To address these issues, organizations are required (“shall”) to adopt a social media policy that provides for lifecycle management of posts. A post is considered a record. Annex E provides more information on social media. As with cloud computing, if an organization uses social media, this should be documented in the IT system management guide and the RM manual.

 

Mobile Devices

Section 7.4 of CAN/CGSB-72.34-2017 describes mobile devices in general terms, such as BYOD and BYOC. Annex F covers corporate-owned, business-only (COBO) and corporate-owned, personally-enabled (COPE) devices. IT professionals will find section 7.4 and Annex F particularly useful, since they give a good overview of the legal questions and implications, and of the need for policies, risk assessment, procedures, strict security, and so on. Mobile Device Management (MDM) and Enterprise Mobility Management (EMM) solutions are also discussed.

 

Annexes

The six annexes in CAN/CGSB-72.34-2017 are new additions. They contain “should” guidance only. The six annexes are:

  1. Annex A: Sources for this standard.
  2. Annex B: Metadata.
  3. Annex C: Preservation formats.
  4. Annex D: Cloud computing.
  5. Annex E: Social media.
  6. Annex F: Mobile devices.

The BYOD risk assessment in Annex F is extremely useful and recommended reading for all stakeholders. Annex F offers an excellent overview of the key risk assessment areas for all stakeholders: security, redundancy, auditability, data ownership, chain of custody, information management, retention and legal holds, disposition of information assets, data integrity, employee termination and loss of revenue. Thinking about risk management in this way will be useful in other situations an organization faces.

 

Bibliography

The CAN/CGSB-72.34-2017 bibliography lists 18 sources. Of the 18 sources listed, 5 are IT standards. Of these, 2 also appeared in the 2005 version:

  • CAN/CSA-ISO/IEC 11179-3-04, Information technology – Metadata registries (MDR) – Part 3: Registry metamodel and basic attributes; and
  • CAN/CSA-ISO/IEC 14662-01, Information technology – Open-edi reference model.

In 2010, CAN/CSA-ISO/IEC 14662-01 was revised and replaced by CAN/CSA-ISO/IEC 14662-10 (R2015), which remains current.

In 2013, CAN/CSA-ISO/IEC 11179-3-04 was revised and replaced by ISO/IEC 11179-3:2013, which remains current.

The three new IT standards introduced in CAN/CGSB-72.34-2017, all on security, are:

  • ISO/IEC 27001, Information technology – Security techniques – Information security management systems – Requirements;
  • ISO/IEC 27002, Information technology – Security techniques – Code of practice for information security management; and
  • ISO/IEC 27005, Information technology – Security techniques – Information security risk management.

Meanwhile, the RM sources grew from 3 to 12 in 2017: 5 ISO standards for RM were introduced, for a total of 7 ISO RM standards, and ARMA International contributed 4 technical reports and guidelines.

The Uniform Electronic Evidence Act, which also had its own annex in the 2005 CGSB version, was dropped from CAN/CGSB-72.34-2017. A new source, The Sedona Canada Principles Addressing Electronic Discovery (Second Edition), was added. The changes in sources from 2005 to CAN/CGSB-72.34-2017 clearly demonstrate the general shift in vocabulary and language from IT to RM that has taken place in business generally.

Organizations may wish to invest in acquiring these five IT standards, along with the 12 RM source standards, technical reports and guidelines listed in the CAN/CGSB-72.34-2017 bibliography, to add to their corporate body of knowledge for managing electronic records. At the time of writing, the CEA, PIPEDA and Sedona Canada are available free online; those interested in cybersecurity, privacy law and data protection may find these legal resources valuable.

 

Conclusion: Get CAN/CGSB-72.34-2017 and Implement It!

All organizations should consider investing in implementing CAN/CGSB-72.34-2017 to clearly demonstrate their commitment to preserving the reliability of their records, and to preserving legally admissible evidence, through responsible business management. IT and RM are the technical leads for implementing CAN/CGSB-72.34-2017 and must therefore develop a robust, integrated approach. Improved organizational knowledge management, decision-making and information service delivery are among the key benefits, ones staff need in their daily work. These are important assets for organizations in reaching their long-term goals: reduced information risk and better compliance.

Applying CAN/CGSB-72.34-2017 can bring real practical benefit to organizations. The standard can help to:

  • increase legal compliance;
  • ensure the legal admissibility of electronic records;
  • reduce digital information chaos and clutter;
  • increase “single source” capture and management of data, making it available electronically for use; and
  • protect information against unauthorized access, disclosure, modification, use and deletion/destruction.

Our final major point: implementing appropriate standards, systems and procedures for managing electronic records improves information governance. It helps organizations build a stable pool of authoritative, legally compliant, reliable and secure information, ready to be optimized and leveraged by systems and technologies. The end result is a win for everyone. You can reduce information risk and improve compliance, while staff and management get better results with less effort for the information management work they are required to do. This, our legal, RM and IT friends, is the winning ticket, and what we aspire to achieve.

 

 

1“Standards Council of Canada,” What is a National Standard of Canada (NSC)?, https://www.scc.ca/en/faq-what- is-a-national-standard-of-canada, accessed on August 24, 2017.
2Section 6.1 General, page 14.
3Section 0.1 About this standard, page iv.
4See https://www.tpsgc-pwgsc.gc.ca/ongc-cgsb/programme-program/normes-standards/notification/sect1b- eng.html.
5Section 5.2 Requirements for admissibility of electronic records as documentary evidence, page 9.
6See https://thesedonaconference.org/publication/The%20Sedona%20Canada%20Principles.
7See http://canlii.ca/t/g7n43.
8The leading test for Canadian courts to qualify experts is White Burgess Langille Inman v. Abbott and Haliburton Co., 2015 SCC 23 (CanLII), http://canlii.ca/t/ghd4f
9Section 6.1 General, page 14.
10Ibid.
11Ibid.
12Ibid.
13Section 6.3.2 Content of the policy, pages 15-16.
14Ibid.
15Section 6.4.1 General, page 16.
16Ibid.
17Ibid.
18Ibid.
19Section 0.1 About this standard, page iv.
20Section 6.4.2.2 Digitization, page 17.
22Section 6.4.5 Records retention requirements, page 18.
23Ibid.
25Section 6.4.6.4 Transfer of electronic records to another entity, page 19.
26Section 6.4.6.5 Records preservation, page 19.
28Ibid.
29Ibid.
30Section 6.4.7 Quality assurance, page 20.
31http://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science/#7f0fa8cb69fd
32Page 20.
33R.S.C. 1985, c. C-5, http://canlii.ca/t/52zlc
34See sections 69 and 69.1 of the Freedom Of Information And Protection Of Privacy Act, R.S.B.C. 1996, c. 165 (http://canlii.ca/t/52sth), section 77 of the Right to Information and Protection of Privacy Act, S.N.B. 2009, c. R-(http://canlii.ca/t/52wbq) and section 72 of the Access to Information and Protection of Privacy Act, 2015, S.N.L. 2015, c. A-1.2 (http://canlii.ca/t/52wbq).

CAN/CGSB 72.34-2017 ELECTRONIC RECORDS AS DOCUMENTARY EVIDENCE

 By Sharon Byrch, Uta Fox, CRM, FAI and Stuart Rennie JD, MLIS, BA (Hons.)

 

Introduction

In March 2017, the Canadian General Standards Board (CGSB) released the new Electronic Records as Documentary Evidence standard, CAN/CGSB-72.34-2017. The new CAN/CGSB-72.34-2017 supersedes and replaces the 2005 version, CGSB 72.34-2005. Two authors of this article, Uta Fox and Stuart Rennie, were members of the CGSB committee that developed CAN/CGSB-72.34-2017. Having co-presented on both versions of CAN/CGSB-72.34 at ARMA Canada's national conferences and to ARMA Canada chapters around the country, they are regularly contacted for more information on this standard. This article responds to those requests from the RM community to learn about the 2017 update and how to apply it.

Widening the scope of the article, they approached Sharon Byrch, an RM professional, who agreed to collaborate on it. This article discusses the major changes made in CAN/CGSB-72.34-2017 compared to the 2005 version from the RIM, IT and legal perspectives. While the primary focus is on significant changes to content, it also offers recommendations for implementing the standard in your RM and information governance (IG) programs.

In this article, the views expressed are the authors' own and do not represent those of any organization. As well, the information provided in this article is for information purposes only and does not constitute legal, technical or other professional advice.

The purpose of the CAN/CGSB-72.34-2017 standard is to set out principles and procedures for organizations to use for managing their electronic records to enhance their admissibility as evidence in legal proceedings. The CAN/CGSB-72.34-2017 is a voluntary standard, available for purchase in paper and electronic formats and in both English and French. The CAN/CGSB-72.34-2017 is not a substitute for expert legal and technical advice.

The CGSB is a Canadian federal government organization that assists with the development of standards in many industries. Currently, there are over 300 CGSB standards, each developed by a committee of volunteers who are experts in their fields. The CGSB is accredited by the Standards Council of Canada (SCC) as a Standards Development Organization. The SCC is a federal Crown corporation whose mandate is to promote efficient and effective standardization in Canada. Additionally, the CAN/CGSB-72.34-2017 is a National Standard of Canada, as evidenced by the inclusion of “CAN” in the standard’s reference number, indicating it is recognized as the official Canadian standard on a particular subject.1

When reading the standard, users should be cognizant of the language CGSB uses to signal what organizations must do to achieve compliance. Take the word “shall”, often used in CAN/CGSB-72.34-2017. “Shall” denotes mandatory requirements. For example, “An organization shall establish the RM program”2. This is a must statement: organizations using this standard must have an authorized RM program in place. The word “should” signals a recommendation, while “may” articulates an option, or what is permissible within the limits of the standard.

In addition to demonstrating the integrity, authenticity and reliability of electronic records to meet evidential requirements, organizations need to ensure the electronic records they create, receive and maintain in electronic recordkeeping systems are in compliance with the CAN/CGSB-72.34-2017 to ensure their admissibility in court proceedings. As noted:

This standard provides a framework and guidelines for the implementation and operation of records systems for electronic records, whether or not any information held therein will ever be required as evidence. Thus compliance with it should be regarded as a demonstration of responsible business management. Applying the standard to an organization’s business will not eliminate the possibility of litigation, but the probability is that it will make the production of electronic records easier and their acceptance in a legal procedure more certain.3

One change that occurred with the CAN/CGSB-72.34-2017 update is that the CGSB has withdrawn a companion standard, CAN/CGSB-72.11-93 Microfilm and Electronic Images as Documentary Evidence, effective January 24, 2017.4 The withdrawal of CAN/CGSB-72.11-93 reflects its limited use and the lack of support for its revision. As well, section 3 of Part III and section 3 of Part IV of CAN/CGSB-72.11-93, concerning electronic images, are incorporated into CAN/CGSB-72.34-2017. Organizations can still use and reference CAN/CGSB-72.11-93, but this is not recommended, since CAN/CGSB-72.11-93 no longer carries the weight of, or support from, the CGSB. As a result, organizations should use the most current version of the standard, CAN/CGSB-72.34-2017.

 

Legal Requirements For Electronic Records As Documentary Evidence

Section 5 of the CAN/CGSB-72.34-2017 sets out the legal requirements for electronic records as documentary evidence.

Like the 2005 version, the new 2017 standard focuses on these legal requirements for legal admissibility:

[U]se of an electronic record as evidence requires proof of the authenticity of the record, which can be inferred from the integrity of the electronic records system in which the record is made or received or stored, and proof that the record was “made in the usual and ordinary course of business” or is otherwise exempt from the legal rule barring hearsay.5

The result is that the 2017 and 2005 versions are substantially similar so organizations using the 2005 standard can amend their policies, procedures and workflow to comply with the new CAN/CGSB-72.34-2017 without having to start over. This is helpful.

The CAN/CGSB-72.34-2017 refers consistently to the Canada Evidence Act (CEA) and does not mention provincial or territorial statutes. Organizations that do not come under Canadian federal jurisdiction will need to review the applicable provincial or territorial law that applies to them for their electronic records.

In addition, organizations can make use of other available legal sources to understand how courts and tribunals across Canada are interpreting these legal requirements. A useful legal information source is the website of the Canadian Legal Information Institute: https://www.canlii.org/. CanLII, a non-governmental, non-profit organization, provides free Internet access to court judgments, tribunal decisions, and statutes and regulations from all jurisdictions across Canada. CanLII has an easy-to-use search interface, and its legal information is available in both English and French. CanLII is one of the best sources for lawyers and the general public to consult Canadian law.

Section 5 of the CAN/CGSB-72.34-2017 rightly emphasizes that organizations must prove that their electronic records are legally admissible. Two key evidentiary tools for organizations to use are the: (1) RM manual and (2) IT system management guide. These tools should contain the organization’s authorized policies and procedures to provide documentary evidence that the organization has met the legal requirements for legal admissibility of its electronic records.

CAN/CGSB-72.34-2017 has new, welcome additions to reflect changes in technology from 2005:

  • Electronic discovery (e-discovery);
  • Technology Assisted Review (TAR) using specific search software;
  • Legal hold; and
  • Electronic and wet ink signatures.

 

Electronic Discovery (E-Discovery) And Litigation Preparedness

In section 5.3, for the first time, CAN/CGSB-72.34-2017 refers to e-discovery in civil litigation. E-discovery is a pre-trial procedure where parties to the litigation are required to exchange relevant electronic records as supervised by the courts. As organizations are now creating and managing electronic records in greater numbers, the challenge for them is being able to search, access and then produce relevant electronic records required by Canadian courts so that these electronic records can be admissible as evidence in the legal proceedings.

In 2008, to assist organizations in producing admissible electronic records in Canada, The Sedona Canada Principles Addressing Electronic Discovery (Sedona Canada Principles) were developed. Sedona Canada is a non-governmental volunteer organization of lawyers, judges and technologists across Canada. The Sedona Canada Principles are largely a voluntary standard like the CGSB standards, except in Ontario: since 2010, the Sedona Canada Principles have been mandatory in Ontario courts under the Ontario Rules of Civil Procedure. Outside of Ontario, the Sedona Canada Principles are acceptable for use in courts and tribunals across the country to assist parties to produce and access relevant electronic records in a timely, efficient and cost-effective manner. A body of Canadian e-discovery case law is now developing around the Sedona Canada Principles.

In 2015, much as the CGSB did in 2017 with CAN/CGSB-72.34-2005, Sedona Canada issued an updated Second Edition of the Sedona Canada Principles to account for changes in technology and the law.6

As well, for the first time, for disclosure in criminal law proceedings, the CAN/CGSB-72.34-2017 refers to R. v. Oler, 2014 ABPC 130 (CanLII).7 The Oler case is a leading case for several reasons. First, Oler is the first case to consider and apply the 2005 CAN/CGSB-72.34 standard. Second, Oler is the first case in Canada to hold that an organization’s scanned records are admissible as electronic evidence because its scanning process complies with both the Canada Evidence Act and the Alberta Evidence Act through its use of industry standards. Third, Oler is the first case in which a Canadian court accepted records management expert evidence. The expert in Oler was Uta Fox, co-author of this article. Uta Fox is the first records manager known in Canada to be accepted as a court-qualified expert witness. In that role, she assisted the court in determining the legal admissibility of electronic records.

There is little case law in Canada on the relationship between RM and the law. More case law is needed to advance both the RIM profession and the admissibility of records, especially electronic records, in Canadian courts. This situation becomes more critical as organizations become increasingly ‘digital’ and create and manage only electronic records, not paper records. Given the preference of Canadian courts to cite with approval National Standards of Canada like CAN/CGSB-72.34-2017, the Oler case is a step in the right direction for RIM.

As well, there is an increasing need for RM professionals to be accepted in court as experts, like Uta Fox was, to assist the court in determining the admissibility of electronic evidence.

Canadian courts do not qualify persons as experts lightly. Courts need proof that the person has proper qualifications, education, specialized knowledge and expertise to be an expert.

Furthermore, the person must be independent and impartial to assist the court as an expert.8 Organizations, like the Calgary Police Service in Oler, that use electronic document and records management systems (EDRMS) are operating and managing complex RM and IT systems, with legal implications beyond the experience of the public or the casual user. As more organizations move to EDRMS, more RIM experts need to be admitted by courts to advise on how these complex systems work and to prove the admissibility of electronic records in both civil and criminal cases. Oler is the first case in Canada to permit the admissibility of scanned electronic records in place of paper originals using an EDRMS. We need to develop the RM case law to keep pace with the use of EDRMS in business.

From a legal perspective, these are welcome additions to the standard. Organizations can refer to the Sedona Canada Principles and the Oler case, in addition to CGSB 72.34-2017, as sources of credible best practice to further enhance the organization’s legal compliance and reduce the organization’s legal risks.

 

Technology Assisted Review (TAR) And Other Automated Tools And Techniques

The 2005 standard is silent on Technology Assisted Review (TAR); it is introduced for the first time in section 5.3.1 of CAN/CGSB-72.34-2017. TAR uses technology to automate the process of identifying relevant electronic records. As with e-discovery, CAN/CGSB-72.34-2017 rightly refers to the Sedona Canada Principles. Organizations are increasingly using TAR and it is being referred to in Canadian case law, so this is another welcome addition.

 

Legal Hold

Section 5.4 of CAN/CGSB-72.34-2017 refers to a “legal hold”, which was absent from the 2005 standard. A legal hold, increasingly applied to all forms of records, is a process by which an organization preserves all potentially relevant records when litigation is reasonably anticipated or underway. The risk for an organization that does not preserve electronic evidence with a legal hold is that relevant evidence is inadvertently destroyed and thus unavailable to the courts when determining liability. When an organization inadvertently destroys important evidence, it exposes itself to legal sanctions, as courts take a dim view of the destruction of evidence as a tool to subvert the rule of law.

CAN/CGSB-72.34-2017 refers to the Sedona Canada Principles on legal holds. Taken together, CAN/CGSB-72.34-2017 and the Sedona Canada Principles provide organizations with a good understanding of what a legal hold is, how to apply it and how to avoid the risk of inadvertently destroying or spoliating relevant electronic evidence. Both standards highlight the need for expert legal advice as soon as possible when a legal hold is needed.
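To make the legal-hold process concrete, the following is a minimal sketch, in Python, of how a records system might suspend disposition while a hold is active. It is purely illustrative: the class and field names are our own invention, not prescribed by CAN/CGSB-72.34-2017 or the Sedona Canada Principles.

```python
from dataclasses import dataclass


@dataclass
class Record:
    record_id: str
    series: str
    on_hold: bool = False


class LegalHoldRegistry:
    """Suspends disposition for records covered by an active legal hold."""

    def __init__(self):
        self.holds = {}  # hold_id -> set of record ids under that hold

    def apply_hold(self, hold_id, records):
        self.holds[hold_id] = {r.record_id for r in records}
        for r in records:
            r.on_hold = True

    def release_hold(self, hold_id, records):
        released = self.holds.pop(hold_id, set())
        # A record stays on hold if another active hold still covers it.
        still_held = set().union(*self.holds.values()) if self.holds else set()
        for r in records:
            if r.record_id in released and r.record_id not in still_held:
                r.on_hold = False

    def may_destroy(self, record):
        # Disposition must be suspended while any hold applies (s. 5.4).
        return not record.on_hold
```

The release logic matters in practice: a record becomes eligible for disposition again only once no active hold covers it, which protects records caught by overlapping holds from different matters.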

 

Signatures

The 2005 standard does not have a specific section for electronic and wet signatures; it makes only passing reference to them.

The CAN/CGSB-72.34-2017 dedicates section 5.5 to electronic and wet signatures. A wet signature is usually ink, a paper seal or even the traditional wax. The distinction between electronic and wet signatures is useful. Many organizations today use both wet and electronic signatures; there is often confusion and uncertainty in organizations about the legal admissibility of these two signatures. CAN/CGSB-72.34-2017 provides the basic requirements for use of these signatures and when to seek expert advice to ensure legal admissibility.

 

Authenticated Paper Copies For Legal Proceedings (“True Copies”)

For “True copies”, the 2005 standard in section 5.6 required paper copies to be authenticated as a “true copy” of the original with a signature to enhance its admissibility and weight in court. That authentication needed to be documented with procedures.

In the 2017 version, section 5.6 is substantially similar in wording to the 2005 standard.

The main difference is that the 2005 standard does not mention “electronic records”; it is implied. CAN/CGSB-72.34-2017 uses the phrase “electronic records” to make it clear that paper copies of electronic records need to be authenticated by signature as “true copies” of the electronic records.

CAN/CGSB-72.34-2017 also adds that an affidavit can be used as the authentication record; the 2005 standard does not mention the affidavit. The new reference to using an affidavit is helpful since the affidavit is a common tool of evidence in the litigation process.

 

Records Management (RM) program

Both versions of CAN/CGSB-72.34 have provisions on establishing a RM program. The 2005 version’s heading, “Establishing a Records management system (RMS) program”, changed in 2017 to “Records Management (RM) program.”

Before discussing RM changes in CAN/CGSB-72.34-2017, let’s review what is carried forward from 2005, since it is vital to your RM program. Sections 6.1 to 6.4.1 of CAN/CGSB-72.34-2017 address the fundamental components an organization must have in place: the “records management concepts, principles, methods and practices adopted by the organization shall demonstrate that an appropriate RM program is in place and is an integral part of the organization’s usual and ordinary course of business.”9

Your organization must have an authorized:

  • RM program;
  • RM policy;
  • RM manual; and
  • records officer (RO).10

In section 6.1, the RM program itself “shall support a records system consisting of appropriate records procedures and controls that complement business operating procedures.”11 An organization shall:

  • establish the RM program;
  • develop a RM policy, with definitions and assignment of responsibilities;
  • design RM procedures and related documentation;
  • select and implement technologies supporting the records system;
  • establish records protection measures, including audit trails and backup; and
  • establish a records quality assurance program.12

Notice, these are “shall” statements; meaning, the organization must support its records systems using suitable records procedures and controls that complement its business operations.

Organizations must establish a RM program, have a policy, use definitions, assign responsibilities, develop procedures, select technologies, implement records protection processes and adopt a quality assurance program. These are non-negotiable factors that must be in place before an organization can even begin to comply with this standard.

 

Responsibility – Records Officer (RO)

Both the 2005 and the 2017 versions require organizations to designate a RO responsible for implementing the RM program and ensuring it is part of the organization’s usual and ordinary course of business. Additionally, the RO must ensure that the organization implements the RM policy, program and RM manual. The RO’s role and responsibilities must be defined in policy, bylaw or directive, as this role is central to establishing and maintaining the RM program and providing oversight to ensure organizational compliance with the standards, policy and legislation impacting the RM program.

CAN/CGSB-72.34-2017 provides more direction to the RO by acknowledging that the organization’s policy must make the RO responsible for maintaining and amending the RM manual, the RM program and the records system functions; these responsibilities must be fully supported by IT personnel. Not only must the RO’s responsibilities be documented in the RM program, but the duties of all personnel with RM responsibilities must also be defined for compliance monitoring.13 That directive extends to all employees of an organization who have any RM duties, not just the staff of the RM department. For compliance with CGSB 72.34-2017, make sure your organization addresses the RM requirements and duties of all its employees.

 

RM Policy

Both CGSB versions emphasize that senior management must authorize a RM policy that stipulates the management of electronic records is an integral component of the organization’s usual and ordinary course of business. Further, the policy needs to identify and address:

  • applicable records, records systems and exclusions (scope);
  • relevant RM and IT standards used in RM programs;
  • the RO’s responsibility for the records system;
  • compliance of the record system with RM manual, the law, national, international and industry standards;
  • that the RO is responsible for maintaining and updating the RM manual;
  • requirements for records creation, management, use and disposition;
  • that IT works with RO to integrate RM into organization’s usual and ordinary course of business; and
  • that responsibility for quality assurance processes is assigned to the RO.14

 

RM Manual

The RM manual is the instrument that consolidates all records-related procedures to ensure completeness and consistency of practice,15 which both versions of CAN/CGSB-72.34 address. The required components of a RM manual, for example, procedures for making, receiving, capturing and disposing of records, are also consistent between the versions. CAN/CGSB-72.34-2017 emphasizes that the RM manual:

shall be kept up-to-date and accurately reflect the exact nature, functions, procedures and processes of the organization’s records system, i.e., the way in which this system participates in and supports the usual and ordinary course of business;16

CAN/CGSB-72.34-2017 notes that the manual must specify the operation and use of records systems and include reference to other applicable documentation, for example, the IT System Management Guide, business procedures or IT system documentation.17 This is another example of the standard compelling IT and RM to collaborate in complying with requirements.

While RM communicates its requirements differently from IT, these two fields clearly have shared interests and the same end goal: better information governance over electronic records management. It is strategic for RM and IT to support each other, because helping one field advances the goals and objectives of the other and, ultimately, both. IT and RM are essentially different expressions of the same objective: your stakeholders overlap, and you share the same risks, consequences, resources and priorities.

The future of electronic records management depends on its successful implementation and integration within organizations. The working relationship between IT and RM can make or break this work, and it is up to you both to build the capacity to adopt this strategic management methodology. The sooner you figure out how to work together and develop the resources, communications and support to advance your work, the better off you are.

Keeping the manual current using a formal review process at pre-determined times ensures the RM manual continues to reflect the business of RM for electronic records in real time.18

Your RM policy, program, manual and RO together form the foundation of your RM operations. Organizations cannot be compliant with CAN/CGSB-72.34-2017 if they lack a RM policy, program, manual or RO when called upon to produce their electronic records for court purposes. As CAN/CGSB-72.34-2017 states, compliance with this standard “should be regarded as a demonstration of responsible business management.”19

 

Digitization

CAN/CGSB-72.34-2017 incorporates the fundamental elements of Microfilm and electronic images as documentary evidence (CAN/CGSB-72.11-93). If you are implementing an imaging program as part of your RM program, you can use the following checklist itemizing CAN/CGSB-72.34-2017’s requirements,20 described in Table I below. If your organization has an existing imaging program, use the checklist to audit your program against CAN/CGSB-72.34-2017’s digitization requirements.

This checklist provides the “shall” requirements captured directly from the standard; see the “CAN/CGSB-72.34-2017 Requirement” column below. It also identifies the requirements that must be addressed and actioned, in the “Action Required” column. Lastly, the “Responsibility” column is populated with the positions responsible for completing the actions. This checklist is a recommendation only, but we encourage you to develop a tracking system of some type to document the status and progress of your implementation.
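As one way to build such a tracking system, a simple tracker mirroring the checklist’s columns could look like the sketch below. It is illustrative Python only; the field names simply echo the table headings and the example wording is not taken from the standard.

```python
import csv
import io
from dataclasses import dataclass


@dataclass
class ChecklistItem:
    requirement: str     # "shall" statement from the standard
    action: str          # action required to satisfy it
    responsibility: str  # position responsible for the action
    status: str = "Not started"


class DigitizationChecklist:
    """Tracks implementation status against Table I-style requirements."""

    def __init__(self):
        self.items = []

    def add(self, requirement, action, responsibility):
        self.items.append(ChecklistItem(requirement, action, responsibility))

    def update_status(self, index, status):
        self.items[index].status = status

    def outstanding(self):
        return [i for i in self.items if i.status != "Complete"]

    def to_csv(self):
        # Export in the same column order as the checklist table.
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["CAN/CGSB-72.34-2017 Requirement", "Action Required",
                         "Responsibility", "Status"])
        for i in self.items:
            writer.writerow([i.requirement, i.action, i.responsibility, i.status])
        return buf.getvalue()
```

Even a spreadsheet serves the same purpose; the point is that each “shall” requirement carries an action, an owner and a status that can be audited later.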

 

Table I

Digitization Checklist – 6.4.4.4 Digitization

Records Retention Requirements

A number of modifications were made to the records retention section by CAN/CGSB-72.34-2017. The 2005 version addressed consumer goods and services requirements in assigning retention periods, while the 2017 version adopts a general business approach. CAN/CGSB-72.34-2017 stipulates that to assign proper retention periods, authorized individuals responsible for the organizational functions that the records support (e.g., legal representation, finance, HR) must be included in records appraisal and retention decisions. Ideally, organizations should establish a RM committee, composed of subject matter experts from legal, finance, HR and the like, to be responsible for these decisions. Decisions made by the authorizing committee assigning retention periods for record series must be documented as part of the Records Retention Schedule and linked to the classification system.

The standard provides several critical appraisal factors that can ultimately define retention requirements including:

  • How the organization uses the records, internally and externally;
  • Users’ need for access if a disaster occurs;
  • Financial, legal, social, political, historical value of the records;
  • Costs/benefits analysis of retention;
  • Impact on the organization if records are destroyed; and
  • Evidentiary value of records if required for litigation, audit or investigation.22

Other changes in the 2017 standard focus on the organization’s records system and its ability to accommodate fixed-date, event-date and permanent retention. The RO must review all records dispositions before any dispositioning takes place to ensure that records scheduled for disposal are not subject to a legal hold or an organizational or government review.23 If they are, the records must be suspended from destruction as outlined in section 5.4 Legal hold of CAN/CGSB-72.34-2017.
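The fixed-date/event-date/permanent distinction, together with the RO’s pre-disposition review, can be sketched as follows. This is illustrative Python under simplifying assumptions: real retention schedules often round to fiscal year-end and involve more trigger types and review steps than shown here.

```python
from datetime import date


def disposition_date(trigger, retention_years):
    """Return the earliest disposition date, or None if not yet determinable.

    trigger: the file closure date (fixed-date retention) or the date the
             triggering event occurred (event-date retention); None while
             the triggering event has not yet happened.
    retention_years: None means permanent retention.
    """
    if retention_years is None:   # permanent: never eligible for destruction
        return None
    if trigger is None:           # event-date series with the event pending
        return None
    try:
        return trigger.replace(year=trigger.year + retention_years)
    except ValueError:            # trigger was Feb 29, target year not a leap year
        return trigger.replace(year=trigger.year + retention_years, day=28)


def eligible_for_disposition(trigger, retention_years, today=None, on_hold=False):
    """RO review gate: retention met AND no legal hold or review applies."""
    today = today or date.today()
    due = disposition_date(trigger, retention_years)
    return due is not None and due <= today and not on_hold
```

Note that permanent series and pending event-date series both return None, so they can never pass the eligibility gate; the legal-hold flag then suspends even records whose retention has expired.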

 

Records Disposition

There are some significant changes to the disposition section in CAN/CGSB-72.34-2017. The 2005 version recognized disposition that included records destruction and transfer to another entity but there was no discussion on transferring records to another body or on records preservation.

CAN/CGSB-72.34-2017 rectifies that by providing substantial details on the disposition process, the destruction of electronic records, transferring records to another entity and preservation practices for those records that are permanently retained or have long-term retention requirements.

 

Destruction of Electronic Records

As mentioned, both versions of the standard acknowledge that records must first meet retention requirements and be authorized for disposition/destruction prior to commencing any dispositioning actions. CAN/CGSB-72.34-2017 specifies that the RM manual must allow the records system to destroy, amend or correct records using an auditable process. For records that have been destroyed, the system procedures must ensure both the record and its locator are destroyed. It also stipulates that the records destruction process must be completed in such a manner that the confidentiality and protection of personal information are not compromised.24

 

Transfer of Electronic Records to Another Entity

In section 6.4.6.4 of the CAN/CGSB-72.34-2017, an organization’s RM manual must document all records transferred to and accepted by the archives (or another entity) and both the organization transferring the records and the organization receiving the records must maintain this documentation. The receiving organization may require additional information such as: the identity of the hardware and software that generated the records, the program documentation describing the format, file codes, file layout and other technical details about the records system in which the records resided.25

 

Records Preservation

Section 6.4.6.5 of CAN/CGSB-72.34-2017 recognizes that organizations must ensure their RM manuals stress that records preservation begins with records created and maintained in preservable file formats that capture the required identity and recordkeeping metadata. Throughout the records lifecycle, organizations must demonstrate that electronic records were created, received and stored in the usual and ordinary course of business, and must continue to meet admissibility requirements by ensuring the records’ authenticity.

Organizations required to maintain records permanently must make certain that their recordkeeping systems are capable of permanently maintaining records, definitely a consideration when selecting records systems, as is the safeguarding of records against software obsolescence.26

 

Records Conversion And Migration

Section 6.4.6.5.1 of the CAN/CGSB-72.34-2017 defines records conversion as changing recorded information from one format to another, while migration is moving recorded information from one IT system to another (both are methods of avoiding software obsolescence and both involve risks). The standard notes that there are two types of digital record obsolescence:

  1. file format obsolescence – when the software cannot open or view the record’s contents. File format obsolescence is addressed by file conversion, or moving a record from one file format to another;
  2. systems obsolescence – when a system or application is no longer supported and records cannot be retrieved. System obsolescence is addressed by migrating digital files to a new system.27

Organizations must have a conversion and migration policy. Additionally, this policy must be documented in your RM manual and include the procedures developed to ensure that the records’ structure, content, identity and recordkeeping metadata are protected and preserved. This protection and preservation must extend to all electronic records, including emails, their attachments, links, proof of delivery, distribution lists and relationships to other records of the organization.

However, before any conversion or migration activity commences the standard stipulates that the organization must identify the following:

  • the required functionalities of the old format;
  • the functionalities that have to be maintained in the new format and system; and,
  • document all decisions.

Most importantly, both migration and conversion must be integrated in a documented business process that is part of the regular operation of the records system.28
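One way to document a conversion in line with the standard’s emphasis on documented processes is to record what the old format provided, what the new format retains, and fixity values for both files. The sketch below is illustrative Python; the field names are our own, not prescribed by CAN/CGSB-72.34-2017.

```python
import hashlib
from datetime import datetime, timezone


def _sha256(path):
    """Fixity value so the source and converted files can be re-verified later."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def conversion_record(record_id, src_path, dst_path, src_format, dst_format,
                      functionalities_required, functionalities_retained):
    """Build one documentation entry for a file conversion (fields illustrative)."""
    lost = sorted(set(functionalities_required) - set(functionalities_retained))
    return {
        "record_id": record_id,
        "converted_at": datetime.now(timezone.utc).isoformat(),
        "source": {"path": src_path, "format": src_format,
                   "sha256": _sha256(src_path)},
        "target": {"path": dst_path, "format": dst_format,
                   "sha256": _sha256(dst_path)},
        "functionalities_required": sorted(functionalities_required),
        "functionalities_retained": sorted(functionalities_retained),
        # Any lost functionality should be reviewed and the decision
        # documented before the source copy is disposed of.
        "functionalities_lost": lost,
    }
```

Comparing the required and retained functionality lists makes any loss explicit, which is exactly the decision the standard asks organizations to identify and document before conversion or migration proceeds.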

 

Preservation Formats

Section 6.4.6.5.2 of the CAN/CGSB-72.34-2017 directs organizations to identify the preferred formats for records preservation by type of record. That is, what type of record will be preserved and in what format. Preservation format decisions must be based on the amount of change that can be introduced before the record becomes too degraded to serve as a reliable copy of the record in a legal proceeding. Further, preservation format decisions should be documented in the RM manual. Annex C, on preservation, provides valuable information on preservation formats.29

 

Quality Assurance

Both the 2005 and 2017 versions require organizations to incorporate quality assurance measures and processes into their RM programs.

Quality assurance means that the appropriate level of service is defined and that staff are aware of and understand that level of service. It also requires that staff have received the necessary training to provide the defined level of service.

Implementing the quality assurance processes is the responsibility of the RO and must include performance and compliance monitoring, self-assessments, external audits, incident handling, and documenting and certifying that all RM duties are fulfilled. Significant issues must be reported to senior management for RM program modifications, if required.30

 

IT Elements

Given the current push in many organizations to become digital, organizations are adopting technologies more quickly than they can incorporate appropriate protocols into systems, technology planning and management. This leads to growing information chaos and increased compliance risks, particularly as records accumulate and spread across systems, spaces, services, devices, formats and the custody of workers. Forbes and other industry research experts currently estimate that electronic information doubles annually.31 Meanwhile, of the total volume of records (hardcopy and electronic), many RM practitioners believe that roughly 5-10% will require permanent retention. RM practitioners in Sharon Byrch’s experience consistently agree that anywhere between 60%-80% of an organization’s electronic information is unwanted clutter: transitory records with short-term or no enduring value, including duplicates, convenience copies, outdated versions and incomplete or abandoned work, or records which have expired and are no longer necessary. These records could be routinely and legally destroyed if they were easily identifiable from the records still required for business.

IT professionals know information clutter is costly. Data storage requirements and associated costs keep rising as organizations continue expanding their digital footprint. Moreover, sorting through information clutter is time-consuming and costly for IT and RM and impacts workers; clutter makes it difficult for workers to distinguish transitory records from official business records when looking for authoritative, trustworthy information to do their jobs. It is a lose-lose situation and a motivating factor for change and for implementing CAN/CGSB-72.34-2017 for better electronic records management, increased legal compliance and reduced risk.

At the same time as organizations are going ‘digital’ to run their businesses, paper hardcopy (analogue) records are becoming legacy documentation. For example, in one organization Sharon Byrch worked for, the ratio of hardcopy to electronic records was roughly 1:1000, meaning that for every 1 hardcopy record created, the organization created 1,000 electronic records. This ratio is a natural progression, by virtue of the decline in the creation of hardcopy records, and is common in many organizations.

Quite honestly, if organizations are not ensuring that their records are captured and made easily available for use electronically, or managed appropriately as records (including for legal, privacy and security purposes), and those records consequently do not meet admissibility requirements for legal proceedings, those organizations are wasting their precious resources, time and money.

Given the current rate of information growth and spread experienced by organizations in all industries, realistically organizations can’t expect to catch up either. To overcome this situation, organizations need to re-think their approach to electronic records management.

 

IT System Management Guide

An important change for IT is that the CAN/CGSB-72.34-2017 updated all references from “system management guide”, the key deliverable and responsibility of IT, to “IT system management guide”. The first instance appears in section 6.5.1 IT system management guide. Section 6.5.1 outlines IT’s overall directive for CAN/CGSB-72.34-2017:

All significant details of the logical and physical architecture of the IT system keeping the records shall be fully documented in the IT system management guide, including the responsibilities and the relationships between IT system management, the RM program, and the conduct of the organization’s business. The IT system management guide shall be structured so that the integrity of the system can be demonstrated for any point in time.32

Meeting this IT directive is essential to successfully making the legal case for records admissibility as documentary evidence in court proceedings. It is the point of this standard from an IT perspective and is directly linked to section 31.2(1)(a) of the Canada Evidence Act, which provides that:

The best evidence rule in respect of an electronic document is satisfied on proof of the integrity of the electronic documents system by or in which the electronic document was recorded or stored.33

The key takeaway here for IT is that IT system management documentation is how to demonstrate IT system integrity for managing electronic records and then having them meet admissibility requirements as evidence. The critical aspect is that the integrity of electronic records is completely dependent on the integrity of the system(s) managing them; this is best described and documented by IT staff who are systems experts and primarily responsible for their technical management. This is the core way IT supports the legal RM requirements for managing electronic records to ensure their future admissibility in court proceedings.

It is important to note that IT is mandated to keep its IT system management guide current as evidence of compliance, just as RM is mandated to do with its RM manual. Practically speaking, these documents are under continuous revision, so it is a best practice for organizations to make sure they retain previous versions as evidence of their compliance.

While the requirement to have a system management guide has not changed since the 2005 standard, CAN/CGSB-72.34-2017 provides better clarity between IT and RM. This is particularly true now that the IT system management guide topics are better separated from the RM procedure manual topics found under section 6.4 Manual. This makes it less confusing to the user. IT and RM professionals will find it interesting to look at the division of responsibilities between these sections.

 

Audit Trail

In section 6.5.5 of CAN/CGSB-72.34-2017, while the technical requirements for audit trails are laid out as an IT responsibility, audit trails are a key priority for RM and legal departments.

Consequently, RM and legal are key stakeholders for IT; they are concerned with the evidentiary requirements being captured and proven by audit trails. It is wise for IT to include these two stakeholders in IT’s management process.

Here is some advice for IT about working with RM on audit trails: RM is particularly well-suited to audit how well a system’s audit trail is capturing and recording the aspects required for evidence, since RM works closely with the business owners of the records to establish what records require capture and management as evidence of their associated business activities. As a result, RM is familiar with the technologies involved. Moreover, RM determines the lifecycle of records and their management requirements through a process of detailed appraisal, and will know the purpose(s), use(s) and legal requirements of the records, their users, the required format(s), how long to keep the records and what to do with them once they are no longer required for business.

RM’s intimate knowledge of the organization’s records collections, their users and their management, including the technologies involved, makes RM a huge asset to IT. This is particularly important when IT is contemplating, testing and implementing system feature changes, new releases, upgrades or new technologies which could impact audit trails in ways unfamiliar, unanticipated or undetectable by IT. As a best practice, IT should involve RM early in the process and give RM time to evaluate and test for possible risks and issues that may need mitigation and documentation for future reference.
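To make the chain-of-custody role of an audit trail concrete, here is a minimal, hypothetical sketch in Python. It is illustrative only and not drawn from CAN/CGSB-72.34-2017: each entry records who did what to which record and when, and chains to the previous entry’s hash so that later alteration of the log is detectable.

```python
import hashlib
import json
from datetime import datetime, timezone

def make_audit_entry(prev_hash: str, user: str, action: str, record_id: str) -> dict:
    """Build one tamper-evident audit-trail entry.

    Each entry captures who performed what action on which record and
    when, and includes the previous entry's hash so any later change
    to an earlier entry breaks the chain and is detectable.
    """
    entry = {
        "user": user,            # who performed the action
        "action": action,        # e.g. "viewed", "modified", "destroyed"
        "record_id": record_id,  # identifier of the affected record
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,  # links this entry to the one before it
    }
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    return entry

# Hypothetical usage: a two-entry log whose chain can be re-verified later.
e1 = make_audit_entry("0" * 64, "jdoe", "modified", "HR-2021-0042")
e2 = make_audit_entry(e1["entry_hash"], "asmith", "viewed", "HR-2021-0042")
```

A real records system would of course use the audit facilities of its EDRMS; the point of the sketch is simply that an evidentiary audit trail pairs the who/what/when of each event with a mechanism for proving the log itself has not been altered.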

 

New Technologies

In section 7 of CAN/CGSB-72.34-2017, for the first time, there are sections regarding:

  • 7.2 Cloud computing;
  • 7.3 Social media; and
  • 7.4 Mobile devices, including bring your own device (BYOD) and bring your own cloud (BYOC).

In CAN/CGSB-72.34-2017, each of these technologies also has its own annex before the bibliography. More importantly, before discussing these technology topics, section 7.1 broadly identifies the need to conduct a risk assessment prior to adopting new technologies.

 

Risk Assessment

In section 7.1 of CAN/CGSB-72.34-2017, as one might expect, it is recommended that a multi-disciplinary approach be taken to conducting risk assessments, by developing a team of key stakeholders representing records, legal, security, privacy, IT and risk management. It is important for organizations to determine if their service providers could or *should* be among these stakeholders.

Depending on your organization’s legal requirements, privacy and security policies and risk management process, you may be required to involve service providers in conducting risk assessments. This is particularly so for public bodies, or organizations which provide services to public bodies, which may be required under the applicable freedom of information statute for their jurisdiction to treat service providers and their agents and/or subcontractors as if they were public bodies. In such cases, public bodies are ultimately accountable for the activities of their service providers. This includes: information access, disclosure, storage management and jurisdictional location, security, backups/disaster recovery and privacy breach management.

Typically, in cases like this, a formal Privacy Impact Assessment (PIA) process is legislated under freedom of information legislation, and tools are in place for documenting the risk assessment. This existing process can be capitalized on for CAN/CGSB-72.34-2017’s purposes.

Using a multi-disciplinary approach for conducting risk assessments is necessary to: 1) fully examine the benefits versus risks of implementing new technologies and 2) develop a solid business case for their implementation or abandonment. The end result of the risk assessment process is the creation of a valuable information asset and tool that serves multiple purposes:

  • informs communications to advise senior management/decision-makers of risks, threats and benefits;
  • informs the development of new policies and procedures for risk mitigation and management where required;
  • establishes a re-usable process and benchmarks the new technology for future development and proposals; and
  • serves as necessary chain of custody documentation to evidence the considerations, decisions, activities and subsequent activities related to the risk assessment process and the technology’s implementation or abandonment.

It is a recommended best practice for organizations to develop a template that they can re-purpose for other risk assessment projects. In this way, organizations can develop a consistent approach and documentation, and they can evaluate and update their templates as needed to ensure compliance. At the same time, organizations should retain previous versions for historical reference.
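As an illustration only, a reusable risk-assessment template like the one recommended above can be sketched as a simple data structure. The field names here are hypothetical, not drawn from CAN/CGSB-72.34-2017, and would be adapted to your own PIA or risk management process.

```python
from dataclasses import dataclass, field

@dataclass
class RiskItem:
    """One line of a reusable risk-assessment template (illustrative fields)."""
    area: str        # e.g. "security", "privacy", "chain of custody"
    threat: str      # what could go wrong
    likelihood: str  # e.g. "low" / "medium" / "high"
    impact: str
    mitigation: str  # planned control or response
    owner: str       # accountable stakeholder (RM, IT, legal, ...)

@dataclass
class RiskAssessment:
    """A versioned risk-assessment register; keep prior versions for compliance."""
    technology: str
    version: int = 1
    stakeholders: list = field(default_factory=list)
    items: list = field(default_factory=list)

    def add_item(self, item: RiskItem) -> None:
        self.items.append(item)

# Hypothetical usage: assessing a cloud document storage proposal.
ra = RiskAssessment("cloud document storage",
                    stakeholders=["RM", "IT", "legal", "privacy", "security"])
ra.add_item(RiskItem("privacy", "data stored outside jurisdiction",
                     "medium", "high",
                     "contractual residency clause; encrypt at rest", "legal"))
```

Keeping the template as a structured, versioned artifact like this makes it easy to re-use across projects and to retain superseded versions for historical reference, as recommended above.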

While not directly mentioned in CAN/CGSB-72.34-2017, organizations can use two International Organization for Standardization (ISO) standards to further strengthen their IT systems. The first is ISO/TR 18128:2014, Information and documentation — Risk assessment for records processes and systems, which organizations can use to assess risks to records processes and systems in order to ensure they can meet identified business needs. The second is ISO 15489-1:2016, Information and documentation — Records management — Part 1: Concepts and principles, which defines the concepts and principles for the creation, capture and management of records, including electronic records.

Here is some final advice about service providers, regardless of whether your organization is legislated to involve them in your risk assessment processes. The practical truth is that if service providers are involved in capturing, storing, maintaining or sharing an organization’s records, they have a role in the risk assessment and, as a best practice, should be included. Organizations will need to understand, evaluate and document their service providers’ system architecture and management protocols, including: security management, data storage management, troubleshooting and technical management. The organization then needs to use this understanding to: (1) detect any risks or threats that need mitigation and communication, (2) establish benchmarks for future development and (3) create the required chain of custody documentation for the risk assessment process.

Another important point about risk assessments, while not explicitly stated in CAN/CGSB-72.34-2017, is that organizations should tackle them early in the process of considering new technology, and should make sure key stakeholders are involved, particularly IT, RM, security, privacy and legal. Depending on your situation, you may also need to involve external stakeholders. For example, in British Columbia, New Brunswick and Nova Scotia, public bodies are required to provide their privacy impact assessments to the government and/or their respective Information and Privacy Commissioner for review.34 External review can seriously impact project timelines, scope and delivery, so organizations should build the risk assessment into each technology project’s requirements and timelines during the early phases. This applies both to new technology and to changes to existing technology, including upgrades, add-ons, new modules and integration with other systems. Remember that changes to existing technology can be just as risky as implementing new technologies, depending on the scope of change involved.

 

Cloud Computing

Section 7.2 of CAN/CGSB-72.34-2017 generally describes cloud computing. Annex D gives more detailed information on cloud computing. CAN/CGSB-72.34-2017 advises that cloud computing, if used by an organization, should be documented in the IT system management guide and RM manual.

 

Social Media

Section 7.3 of CAN/CGSB-72.34-2017 generally describes social media, including: identification, author/ownership, context, reliability, accuracy and authenticity and chain of custody. To address these issues, organizations are advised (“shall”) to adopt a social media policy that provides for the lifecycle management of postings, which are considered records. Annex E gives more information about social media. Like cloud computing, if an organization uses social media, that use should be documented in the IT system management guide and RM manual.

 

Mobile Devices

Section 7.4 of CAN/CGSB-72.34-2017 generally describes mobile devices, like BYOD and BYOC. Annex F discusses COBO (Corporate Owned Business Only) devices or COPE (Corporate Owned Personally Enabled) devices. IT professionals will find section 7.4 and Annex F particularly useful since they give a great overview of the legal issues and implications and the need for policies, risk assessments, procedures, tight security and so forth. Mobile Device Management (MDM) or Enterprise Mobility Management (EMM) solutions are also touched on.

 

Annexes

The 6 Annexes of CAN/CGSB-72.34-2017 are additions. They contain “should” statements only. These 6 Annexes are:

  1. Annex A: Sources for this standard
  2. Annex B:
  3. Annex C: Preservation
  4. Annex D: Cloud computing
  5. Annex E: Social media
  6. Annex F: Mobile devices

The risk assessment for BYOD in Annex F is extremely valuable and recommended reading for all stakeholders. It provides a great overview of key risk assessment areas across stakeholders, including: security, redundancy, auditability, data ownership, chain of custody, managing information, retention and legal hold, disposition of information assets, data integrity, employee termination and revenue loss. This way of thinking about risk management will be applicable in other situations an organization faces as well.

 

Bibliography

The Bibliography of CAN/CGSB-72.34-2017 lists 18 sources. Of the 18 sources listed, 5 are IT standards. Of those, 2 are also found in the 2005 version:

  • CAN/CSA-ISO/IEC 11179-3-04, Information Technology — Metadata Registries (MDR) — Part 3: Registry Metamodel and Basic Attributes; and
  • CAN/CSA-ISO/IEC 14662-01, Information Technology — Open-EDI Reference Model.

In 2010, CAN/CSA-ISO/IEC 14662-01 was revised and replaced by CAN/CSA-ISO/IEC 14662-10 (R2015), which is still current.

In 2013, the CAN/CSA-ISO/IEC 11179-3-04 was revised and replaced by ISO/IEC 11179-3:2013. ISO/IEC 11179-3:2013 is still current.

The 3 new IT standards introduced in CAN/CGSB-72.34-2017 on security are:

  • ISO/IEC 27001, Information technology – Security techniques – Information security management systems – Requirements;
  • ISO/IEC 27002, Information technology – Security techniques – Code of practice for information security controls; and
  • ISO/IEC 27005, Information technology – Security techniques – Information security risk management.

Meanwhile, RM sources increased from 3 to 12 in 2017: 5 ISO standards for RM were introduced, for a total of 7 ISO standards for RM, and ARMA International contributed 4 technical reports and guidelines.

The Uniform Electronic Evidence Act (UEEA), which had its own annex in the 2005 CGSB version, was dropped from CAN/CGSB-72.34-2017, and a new source, the Sedona Canada Principles Addressing Electronic Discovery (Second Edition), was added. These changes in sources from 2005 to CAN/CGSB-72.34-2017 clearly support the general shift in vocabulary and language from IT to RM that has occurred in business in general.

Organizations may wish to invest in acquiring these 5 IT standards, along with the 12 RM source standards, technical reports and guidelines mentioned in CAN/CGSB-72.34-2017’s bibliography to add to their overall corporate body of knowledge for managing electronic records. At the time of this writing, the CEA, PIPEDA and Sedona Canada are freely available online; those interested in cybersecurity, privacy and data protection law may find these legal sources valuable.

 

Conclusion: Get CAN/CGSB-72.34-2017 and Implement It!

All organizations should consider investing in implementing CAN/CGSB-72.34-2017 as a clear demonstration of their commitment to preserving the trustworthiness of their records and to ensuring legally admissible evidence of their responsible business management. IT and RM are the technical leads for implementing CAN/CGSB-72.34-2017, so they need to develop a solid, integrated approach. Improved corporate knowledge management, decision-making and delivery of information services are some of the major benefits that matter to staff in their everyday work. These are important wins for organizations in achieving their long-range targets: reduced information risks and better compliance.

Applying CAN/CGSB-72.34-2017 can provide real practical benefit for organizations. It can help to:

  • increase legal compliance;
  • ensure legal admissibility of electronic records;
  • reduce electronic information chaos and clutter;
  • capture and manage a ‘single source of truth’ and make it available electronically for use; and
  • protect information from unauthorized access, disclosure, modifications, use and deletion/destruction.

Our final major point is that implementing appropriate standards, systems and procedures for electronic records management improves information governance. It helps organizations create a stable pool of authoritative information that is legally compliant, trustworthy and secure, and ready to be optimized and leveraged by systems and technologies. The end result is a win-win situation for everyone: the organization reduces information risk and improves compliance, while staff and management achieve better results with less effort for the information work they are required to do. That, legal, RM and IT friends, is the golden ticket and what we aspire to do.


 

Works Cited

1“Standards Council of Canada,” What is a National Standard of Canada (NSC)?, https://www.scc.ca/en/faq-what-is-a-national-standard-of-canada, accessed on August 24, 2017.
2Section 6.1 General, page 14.
3Section 0.1 About this standard, page iv.
4See https://www.tpsgc-pwgsc.gc.ca/ongc-cgsb/programme-program/normes-standards/notification/sect1b-eng.html.
5Section 5.2 Requirements for admissibility of electronic records as documentary evidence, page 9.
6See https://thesedonaconference.org/publication/The%20Sedona%20Canada%20Principles.
7See http://canlii.ca/t/g7n43.
8The leading test for Canadian courts to qualify experts is White Burgess Langille Inman v. Abbott and Haliburton Co., 2015 SCC 23 (CanLII), http://canlii.ca/t/ghd4f
9Section 6.1 General, page 14.
10Ibid.
11Ibid.
12Ibid.
13Section 6.3.2 Content of the policy, pages 15-16.
14Ibid.
15Section 6.4.1 General, page 16.
16Ibid.
17Ibid.
18Ibid.
19Section 0.1 About this standard, page iv.
20Section 6.4.2.2 Digitization, page 17.
22Section 6.4.5 Records retention requirements, page 18.
24Section 6.4.6.3 Destruction of electronic records, page 19.
25Section 6.4.6.4 Transfer of electronic records to another entity, page 19.
26Section 6.4.6.5 Records preservation, page 19.
27Section 6.4.6.5.1 Records conversion and migration, page 20.
28Ibid.
29Ibid.
30Section 6.4.7 Quality assurance, page 20.
31http://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science/#7f0fa8cb69fd
32Page 20.
33R.S.C. 1985, c. C-5, http://canlii.ca/t/52zlc
34See sections 69 and 69.1 of the Freedom Of Information And Protection Of Privacy Act, R.S.B.C. 1996, c. 165 (http://canlii.ca/t/52sth), section 77 of the Right to Information and Protection of Privacy Act, S.N.B. 2009, c. R-10.6 (http://canlii.ca/t/52wbq) and section 72 of the Access to Information and Protection of Privacy Act, 2015, S.N.L. 2015, c. A-1.2 (http://canlii.ca/t/52wbq).

Metadata in Recent Canadian Case Law

By Joy Rowe

 

1.0  Introduction

Metadata underlies every part of modern records management, from the digital records captured in an electronic document and records management system (EDRMS) or stored in shared drives, to the automatic logs constantly generated by the computer system. The filename entered before saving a new document; the date and time reflecting the last instance the document was opened; the formulas and scripts linking together different portions of a document, such as notes, bibliography and text; and the countless “prior edits” automatically registered as this article was being prepared are all among the hundreds of pieces of metadata associated with a typical digital record. Metadata reflects both the traces of human interactions with electronic systems and the trails of countless automated computer activities that require no human intervention. As a ubiquitous part of the digital realm, metadata is a key element of the records and information we manage.1

Archivists, concerned with the permanent retention of select digital records, have explored the roles of metadata in the multiple tasks of preservation2. Records managers, involved in the creation and numerous business uses of digital records, have focused on the role of metadata for retrieval functions3. Both professions within the information management field have also considered metadata as part of the legal uses of records. This includes the importance of good metadata for timely, accurate identification of records to facilitate the retrieval of records during the legal discovery and document production processes. It also includes exploring the legal understandings of authentication and integrity of the digital recordkeeping system4.

Many discussions of metadata in the legal context tend to center around the discovery process. This article looks at metadata in the Canadian legal context with a focus on admissibility5. Its purpose is to guide records and information professionals in the multiple metadata tasks undertaken in the course of their work where the legal uses of records and information need to be considered. These may include, but are not limited to:

  • establishing meaningful and useful metadata profiles in electronic document or record systems;
  • communicating important legal distinctions of digital information creation methods to IT or other colleagues; and
  • using basic metadata type classification to correctly characterize electronic information as potential legal evidence for admissibility purposes.

Drawing largely from recent case law rulings, this article examines how metadata is treated as electronic evidence in the Canadian legal system. Case law in the area of metadata, and electronic evidence in general, has expanded greatly in the last ten years. Concentrating on rulings related to admissibility, the leading cases on electronic records6 are presented, including those related to the business records exception to the hearsay rule, the best evidence rule and weighting of evidence, the use of standards in authentication, and computer-generated versus human-generated electronic records.

 

2.0 Structure of the article

A historical review of archival and records management literature, beginning with the legal aspects of traditional records and ending with recent work on metadata, is discussed. Then, an overview of the Canadian legal system is given, including how statutory law and case law work together. Next, the main types of evidence – comprising real, documentary, and demonstrative evidence – are presented. Metadata – which include system, substantive, and embedded metadata – are then explored and are shown to be part of real evidence as well as documentary evidence. Finally, the leading Canadian case rulings related to metadata are examined.

 

3.0  Perspectives on records and the law over time

Archival and records management literature has explored the legal aspects of records from multiple perspectives, including how to maintain records as trustworthy potential legal evidence, the uses of metadata for electronic document discovery, and various typologies to describe and explain metadata. The following section is a review of literature to assist records and information management (RIM) professionals in understanding the foundational knowledge of archival science as well as recent evolutions in the archival and RIM fields.

 

3.1  Authenticity of documents

Archivists and records managers are keenly aware of the need to manage records as potential evidence. As explained in Sheppard’s 1984 article, “Records and Archives in Court,” archivists need to understand the criteria for admitting records in court as evidence7, including those of authentication, best evidence, and hearsay rules of admissibility. These key aspects of documentary evidence are deeply embedded in archival science; indeed, central archival concepts of authenticity, integrity, and reliability are built on and derived from Western common and civil law systems.

Roman law held records to be trustworthy because they were kept in inviolable places by trusted custodians such as an archives. The spread of Roman law and the credibility it attributed to records led to widespread forgery. To control for improper and untrustworthy records, specific rules were created that are at the root of basic evidence in common law systems: the best evidence rule and the authentication rule8.

Forgery became a common problem, to the point that specific rules had to be introduced to prevent it, such as a requirement of great formality in the creation and structuring of the original record, and a requirement of authentication by experts whenever a record was offered as proof of a fact at issue9. This adaptation of the law to the circumstances of the times is at the root of the two basic rules of evidence in common law: 1) best evidence rule, which gives a preference for an original document when a document is submitted as evidence and 2) the authentication rule, which requires that to adduce a document there must be evidence provided that the document is what it purports to be10.

The methods of authenticating records have changed over time. Archival documents were authenticated by their chain of custody under Roman law. Sir Hilary Jenkinson believed that records were “authenticated by the fact of their official preservation11.” To him, records’ history of legitimate custody was a sufficient guarantee that a record was trustworthy. Another key archival scholar challenged this faith in the chain of custody and stated that records must be tested for indications of their authenticity through studying their provenance and elements of their form (diplomatics)12. Authenticity may be defined as “the quality of a record that is what it purports to be and that is free from tampering or corruption13.” In current Canadian statutory law, authenticating electronic documents is a burden placed on the party adducing the document, who must provide “evidence capable of supporting a finding that the electronic document is that which it is purported to be14” or, stated another way, “capable of supporting a finding of authenticity.”

Archival science and diplomatics provide a model of “record” and a way to define the authenticity of a record. A record is defined as a document made or received in the course of practical activity and set aside for future action or reference15. To establish its authenticity, the record’s integrity and identity must be demonstrated. The InterPARES 3 project defines integrity as “the quality of being complete and unaltered in all essential respects,” while identity is “the whole of the characteristics of a document or a record that uniquely identify it and distinguish it from any other document or record. With integrity, [identity is] a component of authenticity16.”17

For records created in a digital environment, authenticity is inferred from evidence on how the records have been created and maintained18. In common law legal systems, documentary evidence must be authenticated to be admissible at trial. Authenticity, established through processes of authentication, is codified in our legal systems through statute and common law. Authentication of documentary evidence is accomplished through witness testimony, expert analysis, non-expert opinion, or, in the case of public documents or other special types, circumstances of record creation and preservation19.

 

3.2  Metadata and electronic documents

In records management (RM) literature, metadata plays many roles over the record life cycle. The Information Governance Maturity Model, which incorporates the Principles® from ARMA International, explicitly cites the role of metadata in an effective information governance system, including higher-level principles such as integrity and compliance. Metadata assists in ensuring that information generated has reasonable and suitable guarantees of authenticity and integrity (Integrity Principle) and that an information governance program complies with laws and organizational policies (Compliance Principle)20. The Principles® are a well-known guide for records professionals seeking to assess and improve how records and information are created, managed, and used in organizations21.

Ensuring timely, efficient and accurate retrieval of records, including for legal discovery, is another frequently discussed concern that metadata addresses. Early treatments of metadata for legal purposes tended to draw analogies to analog records. Metadata was initially described in RIM literature using bibliographic schemas, such as the three-way distinction between descriptive, structural, and administrative metadata used widely in the library and archives sphere. This typology of metadata put the focus on availability and retrieval. Descriptive metadata describes a resource for purposes such as discovery and identification. Structural metadata indicates how compound objects are put together. Administrative metadata provides information to help manage a resource25. This library-informed conception of metadata simply extended the typical tasks of bibliographic retrieval to include legal discovery.

A mid-2000s RIM report also used paper documents as an analogy to describe metadata when considering records as legal evidence. Writing on electronic records as evidence, Mason distinguished between two types of metadata. He used analog document examples to explain the metadata associated with electronic documents using a binary distinction between explicit and implicit metadata. He described explicit metadata as metadata that comes from perusing the paper itself, “such as the title of the document, the date, the name of the person that wrote it, who received it and where the document is located.” Implicit metadata was described as automatically originating from the application software, or being supplied by the records creation. Examples of implicit metadata include “types of type used, such as bold, underline or italic; perhaps the document is located in a coloured file to denote a particular type of document, or labels on file folders26.”

A little over ten years ago, it was common to think of metadata through the lens of a paper document world and to apply existing knowledge of bibliographic and library cataloging systems to explain metadata. While analogies to paper or analog documents are useful for describing the relatively new concept of electronic metadata, they arguably also limit the discussion of metadata drastically. It is certainly useful to know where to look for metadata to assist with a certain task, such as discovery, or to understand that some metadata may be hidden or implicit. However, larger questions, such as who creates metadata and how to consider it in legal proceedings, are left unanswered. Indeed, courts and the legal community were finding Electronically Stored Information (ESI) and metadata to be less straightforward than paper records.

Sedona Canada, coming after the US Sedona Conference Principles of 2007, explicitly acknowledged that ESI was a different animal from analog records. The working group opened its best practices document on electronic discovery in 2008 by explicitly stating that “ESI behaves completely differently from paper documents” and “can be mishandled in ways that are unknown in the world of paper27.” They also noted that ESI may be created, copied, and distributed without active human involvement28. Unlike paper documents that record or reflect human statements or intentions, electronic information is not necessarily generated as the result of human activity.

As a consequence, courts and judges seemed unsure whether computer-generated information such as metadata had anything in common with documents or documentary evidence. R. v. Hall (1998) was an early Canadian case that recognized that computer information generated with no direct human intervention was admissible as real, rather than documentary, evidence. In the mid-2000s, courts on both sides of the border were considering how to treat “hidden” or “embedded” information (i.e. metadata). Applying federal and provincial/state rules of court30, courts determined in Williams v. Sprint/United Mgmt Co. (2007)31 in the US and Hummingbird v. Mustafa (2007) in Canada32 that metadata is part of electronic documents and should be produced as part of discovery. In considering discovery questions, the Hummingbird v. Mustafa ruling determined that metadata should be produced, but it also noted that not all metadata will be useful for understanding a document.

The Sedona Principles of the US also concentrated on discovery and production, and they outlined best practices on handling electronic documents. Regarding metadata, the US Sedona Principles stated that metadata can be embedded in the document or stored external to the document on the computer’s file system33. The primary distinction made in US Sedona Principles was between two types of metadata: application and system.

Application metadata is “embedded in the file it describes and moves with the file when it is moved or copied.” System metadata, on the other hand, is stored externally. “System metadata is used by the computer’s file system to track file locations and store information about each file’s name, size, creation, modification, and usage34 35.” This description of metadata moved away from a paper-oriented understanding of documents and shifted focus to the different metadata-generating layers of the computer system.

A different articulation of metadata was elaborated in an ARMA International Education Foundation (AIEF)-commissioned report, “Metadata in Court: What RIM, Legal, and IT Need to Know.”36 Isaza described three types of metadata: substantive, system, and embedded. This typology was taken from the American case Aguilar v. Immigration Customs Enforcement Division (2008):

  1. Substantive metadata is application-based metadata that was not necessarily intended for adversaries to see. Examples of substantive metadata include document author edits and reviewer comments.
  2. System-based metadata includes information automatically captured by the computer system. Examples of system-based metadata include author name, date and time of creation, and modification timestamps.
  3. Embedded metadata consists of text, numbers, and content that is directly input but not necessarily visible on output. Examples of embedded metadata include spreadsheet formulas, hyperlinks to other documents, and charts37.
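The distinction matters in practice because each type of metadata is captured at a different layer. As a minimal sketch (illustrative only, not drawn from Isaza’s report), system-based metadata can be read directly from the file system, while substantive and embedded metadata live inside the file itself and would require an application-specific parser:

```python
import os
import tempfile
from datetime import datetime, timezone

def system_metadata(path: str) -> dict:
    """Collect basic system-based metadata for a file.

    These are the values the Aguilar typology calls system-based:
    names, sizes and timestamps recorded automatically by the
    computer's file system, with no human input. Substantive metadata
    (e.g. tracked changes) and embedded metadata (e.g. spreadsheet
    formulas) are stored inside the file and are not visible here.
    """
    st = os.stat(path)
    return {
        "name": os.path.basename(path),
        "size_bytes": st.st_size,
        "modified_utc": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat(),
    }

# Hypothetical usage on a throwaway file:
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("draft memo")
    path = f.name
meta = system_metadata(path)
os.remove(path)  # clean up the throwaway file
```

Because system-based metadata like this is generated without human intervention, it is exactly the layer most relevant to the authentication questions discussed below.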

While Isaza’s report was on discovery and spoliation and not on issues of admissibility, the metadata distinctions that he included are useful beyond their purposes for discovery.

System metadata (e.g. file names and extensions, sizes, creation dates) is most relevant for authentication purposes. This data could be crucial to authenticating the record for any purpose where authenticity of the record is paramount, like archival or historical uses38 39.

Compared to the role of metadata in assisting with timely discovery, there is much more limited discussion in RIM literature on metadata and its role in legal cases, including the role of metadata in establishing the authenticity of records for admissibility purposes. This may be due to the fact that very few records professionals are called upon as expert witnesses to authenticate digital records before a trier of fact in a court case. In a transnational survey of records professionals, only about 5% of 293 respondents had ever been required to authenticate digital records before a court during the course of their career. In follow-up interviews, some respondents commented that they were more likely to be involved in the production phase, while IT professionals were likely to be called to act as experts of the system40.

In Canadian court cases, this experience also holds true. Records professionals are infrequently called upon to authenticate digital records or the systems in which they reside. A recent exception includes R. v. Oler. This case is discussed further in 5.3: Use of standards in authentication.

Contributions to RIM literature from records professionals with direct experience in authenticating digital records or testifying to the integrity of their records system would be meaningful and useful to the records management community. In the meantime, recent court rulings on electronic evidence give important insight into how metadata is being addressed and considered in today’s Canadian courts.

 

4.0  Legal foundations

Having explored the contributions of archival and records scholars on the topic of the legal uses of records and metadata, it is also important to understand the legal context of how Canadian legal decisions are made. The following section is intended to outline the basic elements of the Canadian legal framework for RIM professionals who may not be experts in case law. The main sources of law are explained, including the distinctions between federal and provincial Evidence Acts. The different types of evidence as well as admissibility rules are described. RIM professionals have a particular contribution to make in understanding and correctly characterizing how the ESI was created. Understanding the wider context of law can greatly assist RIM professionals in ensuring that the appropriate rules of evidence are applied when and if they are called upon to assist in a legal trial.

 

4.1  Statutory and case law

As is the case in most common law jurisdictions, evidence law in Canada comes primarily from common law, which does not vary among jurisdictions. Each court applies essentially the same evidence law, but each jurisdiction also has evidence acts (statutory law) that may modify or supplement common law evidence rules41. The two main sources of law are 1) enacted statutes and 2) cases adjudicated by judges in courts of law. Legislative bodies, including the Parliament of Canada, the ten provincial legislatures, and the three territorial legislatures, are granted the legislative authority to enact statutes42. These sovereign legislative bodies may delegate authority to a subordinate body to create rules, as Ontario’s Courts of Justice Act delegates rule-making authority to the Civil Rules Committee43. Rules of court are therefore a type of subordinate legislation and are a part of statutory law.

In addition to statutory law, case law is the second major source of law. Case law comes from the written decisions of courts on particular matters. Judges’ decisions, and the stated reasons for those decisions, serve as precedents for future courts deciding similar situations. Courts are bound to follow precedent, and the body of case law develops to give guidance to judges as they decide contemporary matters44.

Statutes and regulations, such as court Rules of Civil Procedure, operate together to form a code of procedural law; while they work in tandem, the majority of procedural law in Canadian Superior Courts is set out by the Rules rather than by any statute45. The Civil Rules Committee holds delegated legislative authority to make rules in relation to the practice and procedure of the Court of Appeal and Superior Court in all civil proceedings for the relevant jurisdiction (federal, provincial)46.

The rules of court may not conflict with what is stated in an Act, such as the Evidence or Interpretation Act. However, they may supplement what is provided in the Act by further articulating practice and procedure47. Rules of court give guidelines to judges and lawyers on how things shall be done during a trial and codify expectations and common understandings.

Federal matters and criminal cases are tried under the Canada Evidence Act, and each province and territory has its own evidence act for civil and provincial matters48. Issues specific to electronic evidence have been largely coordinated among the jurisdictions through the Uniform Electronic Evidence Act49. Through the adoption of Uniform Electronic Evidence Act provisions, the focus was shifted away from satisfying authentication and best evidence of a particular record. Instead, the focus moved to inferring trustworthiness from the integrity of the electronic system. The adducing party is required to show that the system was operating as expected when the record was created. If this is done, the evidentiary requirements are satisfied50.

 

4.2  Types of evidence

Evidence is submitted either as oral testimony by witnesses or as real evidence51. As established by statutory law and case law, evidence may be of three main types: real evidence, documentary evidence, and demonstrative evidence. The first type, real evidence, speaks for itself. The trier of fact (judge, jury, or other adjudicator) can use their own reasoning to understand real evidence and can draw conclusions based on what is presented. In a murder trial, a gun registered to the accused may be real evidence52.

The second type, documentary evidence, is evidence that is offered for the truth of the statement it contains53. It is not the fact that the evidence is in a document form which makes it documentary evidence. The crucial characteristic of documentary evidence is that it is “a recording … of an out-of-court statement of fact made by a person who is not called as a witness, that is tendered for the truth of the statement it contains54.”

The third type, demonstrative evidence, is evidence that is not strictly material to the case but which can help the trier of fact interpret the material facts55. Digital animations or other computer-generated images are examples of demonstrative evidence, but they tend to require expert testimony to accompany them. Tests for expert opinion rules will usually apply, including a cost-benefit analysis of whether the demonstrative evidence helps the jury or trier of fact to understand the evidence without creating a prejudicial effect and distorting the fact-finding process56. Demonstrative evidence can in turn be testimonial (e.g. expert witness), documentary (e.g. medical illustration and photographs of an injury), or real (e.g. videotaped re-enactments of events to contextualize witness testimony)57.

A leading case on electronic records acknowledged that metadata may complicate these boundaries of evidence. In Saturley v. CIBC World Markets, Justice Wood stated, “It is possible that a given item of electronic information may have aspects of both real and documentary evidence58.” An email, for instance, has automatically generated metadata that records when the message was sent and from which computer. These “statements” are not made by a person who could be called as a witness and cannot be properly called documentary evidence. The email also has statements made by human declarants in its contents such as the message text itself. As such, an email has components that may be either real or documentary evidence.
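Saturley’s observation can be made concrete with a short sketch. The Python example below, using only the standard library and a wholly hypothetical message (addresses, server, and timestamps are invented), separates an email’s machine-recorded transmission metadata from its human-authored content:

```python
from email import message_from_string
from email.utils import parsedate_to_datetime

# A wholly hypothetical raw message. The Received and Date headers are the
# kind of machine-recorded transmission metadata discussed above; the body
# text is an out-of-court statement by a human declarant.
raw = """\
Received: from mail.example.com ([192.0.2.10]); Mon, 6 Mar 2017 14:02:11 -0500
Date: Mon, 6 Mar 2017 14:02:05 -0500
From: sender@example.com
To: recipient@example.com
Subject: Quarterly figures

Please find the quarterly figures attached.
"""

msg = message_from_string(raw)

# Machine-recorded metadata: when and through which server the message moved.
sent_at = parsedate_to_datetime(msg["Date"])
relay = msg["Received"].split(";")[0]

# Human-authored content: the statement a hearsay analysis would focus on.
body = msg.get_payload().strip()

print(sent_at.isoformat())  # 2017-03-06T14:02:05-05:00
print(relay)
print(body)
```

The two halves of the printout correspond to the two evidentiary characters of the same email: the header fields are candidates for real evidence, while the body is a candidate for documentary evidence.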

 

4.3  Admissibility rules

The basic principles of evidence at trial include admissibility, weight, and standards of proof. These apply regardless of the type of evidence. Admissibility and weight are formally kept separate. Admissibility is the judge or court’s determination that the evidence can be presented during the trial. The judge (or trier of law) determines if the evidence is relevant to a material fact in the case59. The item must have some tendency to make the existence of a fact in the case more or less probable. Admitted evidence becomes part of the body of evidence that goes before the trier of fact (either a judge or jury), who must, at the end of the trial, determine the “weight” of the evidence. This includes determinations of which parts are accepted and which are rejected based on the standard of proof: for civil cases, the party adducing the evidence must prove that the evidence is true on the balance of probabilities, while for criminal cases, the measure used is beyond reasonable doubt60.

In order to be admitted as evidence in court, potential evidence must pass various tests of admissibility. Some admissibility criteria and rules of evidence apply to all types of evidence. For example, rules of materiality and relevance apply without regard to evidence type61. Evidence must assist the trier of fact in drawing the necessary inferences (relevance), and it must relate to the issues or events in question (materiality).

Another admissibility rule, the best evidence rule, says that if there are two records with the same content, then the one that is the “original” will be accepted. Electronic documents can fulfill the best evidence rule if the adducing party provides evidence that speaks to the “integrity of the electronic documents system by and in which the electronic document was recorded or stored62.” Proof of integrity can be established through proof that the storage medium (e.g. hard drive, operating system) was operating properly, proof that the document was recorded or stored by the other (“adverse”) party63, or proof that the document was recorded or stored in the ordinary course of business64. Integrity can be proven by an affidavit or expert evidence65. Integrity of the system can also be proven by providing evidence that current standards, procedures, and practices were adhered to66. Finally, a printout of an electronic document that was “manifestly or consistently acted upon” as a record can be deemed best evidence67.

Other rules of admissibility differ according to the type of evidence considered68. The hearsay rule and its exceptions, for instance, only apply to statements that are made by humans. The hearsay rule prevents statements by a person who cannot be examined as a witness from becoming evidence in court.

Authentication of analog documents often involves having the author confirm that they created it or having a witness confirm that they saw the creation of the document.

Authentication of electronic evidence focuses on the metadata associated with the documents or on the metadata traces left behind when the document was created. The process of creating, saving and editing an electronic document leaves evidence fragments in the form of metadata that can be used to authenticate electronic evidence69.
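As an illustration of the kind of system metadata such a process leaves behind, the Python sketch below creates a hypothetical file and reads back the name, extension, size, and modification timestamp that the operating system records automatically, with no human input. The file name and content are invented for the example:

```python
import os
import tempfile
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical record: create a file, then read back the system metadata the
# operating system captured about it without any human input.
with tempfile.TemporaryDirectory() as d:
    record = Path(d) / "memo.txt"                  # invented file name
    record.write_text("Meeting moved to Thursday.")

    info = os.stat(record)
    modified = datetime.fromtimestamp(info.st_mtime, tz=timezone.utc)

    print("name:     ", record.name)               # memo.txt
    print("extension:", record.suffix)             # .txt
    print("size:     ", info.st_size, "bytes")     # 26 bytes
    print("modified: ", modified.isoformat())
```

None of these values is a statement by a human declarant, which is why, as discussed below, system metadata of this kind is typically treated as real rather than documentary evidence.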

 

4.4  ESI as real and documentary evidence

ESI may be admitted as real, rather than documentary, evidence. Underwood and Penner note that while the distinction between ESI as real or documentary evidence has been recognized in English and American courts, Canadian treatment of ESI as real evidence has been less uniform. They note that Canadian counsel and courts to date (i.e. 2010) have failed to properly characterize ESI as real or documentary at the outset of proceedings, leading to a failure to apply the appropriate rules of evidence to determine admissibility70. Properly characterizing ESI entails an understanding of both how it was created and the purpose for which it is tendered. While legal counsel may ably address the latter question, it is in the realm of records and information management professionals to understand the former and to correctly characterize how the ESI was created so that the appropriate rules of evidence can be applied.

Evidence may be real evidence either for the purpose it is submitted or for the manner in which it was created. If either of the following conditions apply, the evidence should be submitted as real evidence:

  1. the ESI is not tendered for the purpose of the truth of its content, but rather for the fact that it existed or was found in the possession of a person;
  2. the information embodied in the ESI does not depend upon the interposition of a human observer or recorder between the external event and the ESI that captures the information about the event; this ESI results from an automated process or is computer-generated71.

ESI, metadata associated with electronic documents, or analog documents can all be admitted as real evidence when their purpose is to serve as proof of the fact of their own existence rather than for the purpose of providing the truth of the statements they contain72. An email and its associated metadata may be entered as real evidence if they are tendered as proof that the sender sent the communication at a certain time, or to link the statements contained in the email to the author73. However, if the email is submitted because of the contents of the message, it would be adduced as documentary evidence74. The purpose the evidence is intended to serve is more important than the form the evidence takes.

Canadian courts have tended to treat automatically generated ESI in the same fashion as documentary evidence and to admit it under an exception to the hearsay rule, rather than admit it as real evidence75. However, since the publication of Underwood and Penner, several Canadian court rulings, including the influential Saturley v. CIBC World Markets, have cited their work and made finer distinctions between ESI as real or documentary evidence76. Saturley cited this useful passage from Underwood and Penner’s Electronic Evidence in Canada to explain the distinction:

A record that is created by a computer system whose function it is to capture information about cellular telephone calls would be introduced as real evidence, and the record could be relied upon for the truth of its contents without resort to any exception to the hearsay rule. Because the information that is captured (the date, time and duration of the call, for example) is recorded automatically without being filtered through a human observer, the condition for real ESI evidence is satisfied. It is also important to note that the information recorded is itself not an out-of-court statement by a human declarant, but rather it consists of objective information that is captured and recorded by an automated process.

On the other hand, if a record is created by a human sitting at a computer keyboard and entering data, the ESI embodied in the record could not be tendered as real evidence if it is offered for the truth of its contents, and its proponent would have to bring the record within the ambit of an exception to the rule against hearsay. The record would be documentary evidence, and subject to the same limitations as would apply to a conventional document. The information contained in the record has been filtered through a human observer, and the ESI reflects the human declarant’s out-of-court statements concerning what he or she observed, heard or did. It is not real evidence77.

When ESI is not properly characterized as either real or documentary, the unstated assumption is that ESI is documentary and, consequently, hearsay78. However, if ESI is automatically generated or may otherwise be characterized as real evidence, the evidentiary rules relating to documentary evidence need not be applied79 80.

Metadata is one of the most important sources of real evidence81. Some, but not all, metadata is automatically generated and, if properly characterized, need not be admitted as documentary evidence and need not meet the admissibility criteria for hearsay exceptions.

To be admissible as real evidence, ESI must be authenticated and its threshold reliability must be established82. Threshold reliability exists on a spectrum, from “utterly unreliable to highly reliable.” The proponent offering the evidence must satisfy the judge that the information is, on the balance of probabilities, sufficiently reliable. It need not be error-free. As with documentary evidence, the potential for error goes to weight, not admissibility83.

The type of metadata involved can assist in correctly characterizing the evidence under consideration. A useful typology from the US Aguilar case distinguishes three types of metadata for this purpose: system, substantive, and embedded metadata, which are created with varying levels of human input84.

System metadata is an automatic capture of data from the system that created the electronic document and is a type of real evidence that includes the date and time the file was created, as well as details on when the file was modified or viewed. System metadata does not record any out-of-court statements made by human declarants and is therefore not documentary evidence. This type of metadata changes readily and may be intentionally or unintentionally altered. To pass admissibility tests of reliability and authenticity of real evidence, system metadata often must be subjected to expert forensic analysis85.

Substantive metadata is also called application metadata, because it is generated by the application software. It captures information about the creation and editing history of the electronic document. This type of metadata shows prior edits or modifications to the file and includes formatting display rules like font type and line spacing. Depending on the type of information it captures and how it will be put to use, substantive metadata may be either documentary or real evidence86. When the metadata is captured automatically and used to show that a comment or text edit has been included, the metadata is usually considered real evidence. When the metadata is being admitted to prove the truth of the content of a comment or text edit, the substantive metadata would be considered documentary evidence instead87.

Embedded metadata is information that is part of the electronic document because the user placed it there. Examples include spreadsheet formulas, hyperlinks to other documents, a video clip included in a PowerPoint presentation, or even hidden cells or linking relationships in a database. Embedded metadata is rarely automatically generated by the computer system or application software; as such, embedded metadata is most frequently considered documentary evidence88.
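A small illustration of this idea, using a hypothetical HTML document as a convenient stand-in for the spreadsheet and presentation examples above: the hyperlink target is placed in the document by its author and travels with it, yet is not part of the visible output. The file path in the example is invented.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect hyperlink targets (href values) from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical document: the reader sees only "the signed contract", but the
# embedded href records which file the author actually pointed to.
html_doc = ('<p>See <a href="file:///shares/legal/contract_v3.docx">'
            'the signed contract</a>.</p>')

parser = LinkExtractor()
parser.feed(html_doc)
print(parser.links)   # ['file:///shares/legal/contract_v3.docx']
```

Because the author chose and typed the link target, a court treating it as embedded metadata would most likely analyze it as documentary evidence, consistent with the discussion above.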

 

5.0  Metadata and case law

Like digital information itself, case rulings on ESI and metadata have grown drastically as courts and judges have become more accustomed to admitting and weighing electronic evidence. RIM professionals can benefit from understanding trends of the courts such as the “principled approach” to the hearsay rule, the precedent-setting use of RIM professionals and Canadian and international standards in the authentication of evidence, and the potential for automatically-generated metadata to be adduced as real evidence.

Below, recent relevant case law has been grouped by theme or finding as it relates to the RIM profession.

Federal, provincial and territorial case law has gradually been building a body of rulings concerning different types of metadata and types of ESI evidence. In the last ten years, there has been a marked increase in cases that consider questions of electronic evidence. Between 1996 and 2006, nineteen cases dealt with electronic records89. In the ten years from 2007 to 2017, 262 cases considered ESI, metadata or computer evidence. The treatment of metadata has become more firmly rooted in Canadian case law. This section looks at the leading cases on electronic records under the business records exception to the hearsay rule, best evidence and weight, use of standards in authentication, and computer-generated versus human-generated electronic records.

 

5.1  Business records as hearsay exception

A recent and leading case in British Columbia systematically applied the Evidence Act to several documents that had previously been ruled inadmissible by a summary trial judge in an insurance claim for disability payments. Certain portions of the presented evidence had been ruled inadmissible as business record exceptions to the hearsay rule; in McGarry v. Co-operators Life Insurance90, the court reconsidered the ruling and looked specifically at attachments to a witness’s testimony, which included both paper documents, such as printed policies and quotes, and emails.

The majority of the British Columbia Court of Appeal went through the exhibits and used the “principled approach” to the hearsay rule cited in RS II Film Distribution v. BC Trade Development Corp.91 to consider reliability. Specifically, indicators of reliability include whether the source of the information, the preparer of the document, and the creation date are known, and whether there is anything on the document itself indicating it was made in the “ordinary course of business92“. The majority then went through each document previously ruled inadmissible as a business record exception, applied s. 42 of the Evidence Act93 to the affidavit and its accompanying documents, and identified which data establish that the record is reliable (chiefly, the date it was created and whether one of the businesses involved in the case produced the record, as indicated by signatures, document headers, etc.).

The majority of the BC Court of Appeal addressed the question of emails as business records, noting that the Evidence Act of British Columbia does not discuss electronic records or provide criteria for considering them as evidence. They noted that other provinces do, however, and other provincial Evidence Acts94 outline specific concerns regarding authenticity and requirements that the submitting party provide evidence of the integrity of the electronic document system. They noted that since the emails in this case were deemed inadmissible, along with the other documentary evidence, these tests of authenticity and integrity were not performed or considered95. The admissibility of the emails cannot be determined, because these issues were not addressed in the trial case.

However, the majority determined that because all of the evidence in this case was documentary, it was feasible and practical to make a fresh assessment of the evidence, including the evidence (such as the emails) previously declared inadmissible by the summary trial judge96. The decision thereby implicitly determined that the emails are documentary evidence, despite the barriers to determining their authenticity or the integrity of the electronic document system they were created on.

This case is widely cited in other cases for its finding in paragraph 81 that if all the evidence is documentary, it is feasible for an appeal court to make a fresh assessment of the evidence in the case97.

Other cases on the topic of admitting evidence under the burden of s. 31 or with evidence of authenticity, integrity and identity include R. v. MacDonald, 2016 ABPC 142 at para. 27; R. v. J.S.M., 2015 312 at para. 53-54; R. v. Soh, 2014 NBQB 20 at para. 31-32; R. v. K.M., 2016 NWTSC 36 at 57-60; and R. v. Bernard, 2016 NSSC 358 at para. 50-58.

 

5.2  Best evidence, low threshold for admissibility, weight, authentication98

The three aspects of the admissibility of documents are the hearsay rule, authentication, and the best evidence rule. Documents must be adduced for the truth of their contents to be classified as hearsay and to be admissible under the business records or other exception.

There must be evidence that the document is what it purports to be (authentication rule), and that it fulfills the preference for the original of that document (best evidence rule)99. These aspects of documentary evidence have been blurred in court rulings and, due to the generally low threshold for admissibility in Canadian evidence law, any potential for error or suspect quality in the reliability of the evidence will usually go to weight, especially in civil cases100.

Several rulings in the last ten years have stated that concerns about completeness and integrity can be dealt with at trial when the trier of fact determines the appropriate weight to give the evidence. Additionally, questions about whether the presented evidence meets the best evidence rule also do not bar the evidence from admission. These questions are also resolved at the end of the trial during the process of weighting the evidence.

Some examples of these rulings are:

B.L. v. Saskatchewan (Social Services)

All of these inherent frailties and features of the IEIS [Integrated Electronic Information System] records do not make them any less a record of the act, transaction, occurrence or event. The frailties may go to weight or ultimate reliability but do not exclude the records as a business record101.

R. v. Hamdan

[T]he best evidence rule has been relaxed, in part to address technological changes over the past 50 years; the standard is no longer stringent. Where the original document is not in hand, the best evidence rule does not bar admission of secondary evidence of the document, even where it may be incomplete or inaccurate. Concerns about completeness and integrity can be dealt with at trial when the trier of fact determines the appropriate weight to give the evidence102.

Leading Canadian legal scholars also note that the best evidence rule has decreasing value in the modern era. Authentication, the main requirement for admissibility, is a relatively low threshold to achieve. Authors of “The Law of Evidence in Canada” write, “[t]he modern common law, statutory provisions, rules of practice and modern technology have rendered the [best evidence] rule obsolete in most cases and the question is one of weight and not admissibility.”103

Paciocco, the co-author of both Canada’s leading text in evidentiary law and a frequently cited text on electronic evidence,104 notes that if a document is authenticated, it is admissible even though some of the information in the document may be inaccurate. The potential for inaccuracy is a matter of weight, not admissibility. If an electronic document may have been altered or may have inaccurate metadata, the potential imprecision should not affect the admissibility of the document105. Admissibility is a threshold test, and judges prefer to admit evidence and then weigh it all at the end, when the credentials and believability of witnesses or experts can be better established106.

Numerous recent cases have drawn on Paciocco’s articulation of the low threshold of admissibility and “antiquated best evidence” statements to admit electronic records and to consider reliability, best evidence, and questions of the integrity of the system as factors in weight at the end of the trial. For example, R. v. C.L. (2017) considered the admissibility of a copy-pasted and printed-out copy of an instant messenger (IM) conversation, including its inaccurate time-stamp metadata. As documentary evidence, the judge accepted the printout as best evidence and determined that witness confirmation that it was a true reflection of the original IM conversation was sufficient authentication, noting that the low threshold for admissibility had been met.

The common law imposes a relatively low standard for authentication; all that is needed is “some evidence” to support the conclusion that the thing is what the party presenting it claims it to be. As Paciocco states, for the purposes of admissibility authentication is “nothing more than a threshold test requiring that there be some basis for leaving the evidence to the fact-finder for ultimate evaluation107.”

Similarly, in R. v. Hirsch, the majority of appeal judges denied an appeal of a Criminal Code conviction based on three claims, including that the documentary evidence had not been authenticated108. This evidence included screen captures of digital photographs on a Facebook page. The court determined that, given the low threshold of authentication, it was sufficient to accept witness testimony that they saw the post or had some other reasonable basis for believing the screen capture to be what it purported to be109.

I am not persuaded by Mr. Hirsch’s arguments on authentication and the related issue of authorship. In my assessment, s. 31.1 of the Canada Evidence Act is a codification of the common law rule of evidence authentication. The provision merely requires the party seeking to adduce an electronic document into evidence to prove that the electronic document is what it purports to be. This may be done through direct or circumstantial evidence…. Quite simply, to authenticate an electronic document, counsel could present it to a witness for identification and, presumably, the witness would articulate some basis for authenticating it as what it purported to be…. That is, while authentication is required, it is not an onerous requirement110.

Other cases on the low threshold for admissibility or the “antiquated” best evidence rule include R. v. Dennis James Oland, 2015 NBQB 245 at para. 43, R. v. Clarke, 2016 ONSC 3564 at para. 53-55, R. v. Avanes et al., 2015 ONCJ 606, and the labour arbitration ruling of Thacker v. Iamaw, District Lodge 140, 2016 CanLII 62600 (BC LA) at para. 52-54.

 

5.3  Use of standards in authentication

Compliance with standards such as the Canadian General Standards Board’s CAN/CGSB-72.34-2017, Electronic records as documentary evidence, is one way to determine if electronic evidence is admissible. According to s. 31.5 of the Canada Evidence Act,111 electronic documents that are created or stored in accordance with a standard can be considered authenticated and admissible. This was recently confirmed in case law through the voir dire ruling of R. v. Oler112.

R. v. Oler is an impaired driving criminal case in which the facts must be proven beyond reasonable doubt. In this case, the breathalyzer records of the Calgary Police Service were ruled admissible. The Service’s records management professional provided expert testimony as to the integrity of the recordkeeping system in which the records were created. The expert explained that the records were created in accordance with the Service’s procedures manual, which included security protocols and quality assurance processes, and that the electronic records management system (ERMS) contained a permanent audit trail. Further, the ERMS is a DoD 5015.02-certified records system. The court ruled that these records were made in the usual and ordinary course of business and that all evidentiary rules of authenticity, integrity and reliability had been satisfied because the records were created in compliance with CAN/CGSB-72.34-2005113. A “live” document that could be updated (i.e., was not in a fixed form) was ruled admissible as a business record, but the judge noted that its weight should take into account that it does not meet the same CAN/CGSB-72.34-2005 standards114.

 

5.4  Computer-generated versus human-generated

As noted earlier, courts and judges have struggled with how to treat electronically stored information. A key distinction discussed in numerous cases is the determination of how or by whom the information was generated. If the computer information was generated automatically, the evidence is often – but not always – considered to be real evidence. Real evidence needs to be authenticated, but it does not need to meet best evidence or hearsay exception rules.

If the ESI is not automatically generated and was instead created as the result of human intervention, the evidence is generally adduced as documentary evidence and must meet the relevant admissibility criteria.

Desgagne v. Yuen (2006) was a motion seeking the production of the plaintiff’s home computer hard drive and other devices, including the electronic documents, metadata, and internet browser history stored on them115. The court addressed whether metadata is a document, noting that “[t]he information being sought does not fit the ordinary or intuitive concept of a document, electronic or otherwise116.” The data being considered was an automatically generated record of the user’s activities that was not printed nor used in the ordinary course of any business. In spite of these facts, the court noted that the metadata is “information recorded or stored by means of [a] device and is therefore a document” under the BC provincial Rules of Court under Rule 1(8)117.

In seeking to produce a hard drive, it is not necessary to determine whether the ESI is real or documentary evidence. Instead, the metadata must have probative value and be relevant to the facts of the case. In Desgagne, the judge stated that automatically generated metadata could be considered a document but denied the production order because the electronic files and metadata were not demonstrably relevant and the usefulness of the metadata and browser history did not offset privacy concerns118. This case is widely cited for the relevancy limits it places on production orders for metadata and electronic evidence.

In some cases, it is not entirely straightforward whether the data or metadata was generated by a human or by a computer. The Animal Welfare cases considered the situation of scripts. The defendants119 sought to enter evidence of databases, reports generated from SQL scripts, Excel formatted databases or generated outputs from these databases, printed graphs, and screenshots. Some of this evidence – specifically the Sales data and the Customer data that were extracted from a database using SQL scripts – was offered to the court as real evidence.

The plaintiff argued that the scripts used to generate reports show evidence of human skill and experience and were subject to error, making them subject to hearsay admissibility tests. The judge determined that the tests needed to take place on the data, not the scripts, and stated that scripts retrieve the data but do not change the data120. The data were automatically captured by the computer system and are not out-of-court statements.

Accordingly, only authentication was required for admissibility121.

A later ruling122 revealed crucial details about the creation of the scripts. During this case, the questionable integrity of the database was discussed, including its chain of custody. The case also considered testimonial evidence from the person who created the scripts that extracted the data being entered into evidence. The script writer admitted that she was employed by the plaintiff and had been instructed to continue working on the script until it generated results favourable to the plaintiff123.

Scripts are human-created and can be tampered with to achieve a desired outcome. In this case, the judge did not re-determine the nature of the evidence but instead relied on different evidence to reach his final ruling124. Ultimately, concerns about the reliability of the script-generated reports were considered during the weighting of the evidence.

Saturley, supra, considered how to sufficiently establish the admissibility of metadata or electronic information once it has been determined to be real evidence. The party offering the evidence needs to establish that the information was automatically generated; it need not give evidence that the computer information is error-free. The court held:

The first step in the admissibility analysis is to determine whether the party offering the evidence can establish on a balance of probabilities that it fits within the parameters of real evidence as discussed above. That means, it must be data collected automatically by a computer system without human intervention. It appears that this could include a threshold consideration of reliability; however, it is important to remember that reliability is primarily an issue that goes to the weight to be given the evidence and not its admissibility125.

Other cases finding automatically generated metadata or ESI to be real evidence and subject only to authentication include R. v. Nardi, [2012] BCPC 318 at para. 22; R. v. Soh, [2014] NBQB 20 at para. 31; and R. v. Mondor, [2014] ONCJ 135 at paras. 17-19. R. v. Bernard (2016) found that although aspects of the evidence were automatically generated metadata, the Facebook screenshots at issue were documentary evidence (and inadmissible due to lack of authentication)126.

 

6.0 Conclusions

Metadata is frequently generated automatically. Indeed, automatic creation and assignment of metadata can be an indicator of a well-developed information governance system127. Computer-generated metadata easily meets the test for real evidence and can be admissible after passing the authentication test. Increasingly, Canadian courts are acknowledging that automatically generated metadata need not fulfill the evidence tests related to documentary evidence.

Records professionals, who have ideally been involved in all stages of the record life cycle, are well-placed to be able to speak on how metadata was created. To properly characterize metadata as legal evidence, RIM professionals should consider using the three-way distinction described in Underwood and Penner: system, substantive, and embedded metadata.

In determining the type of metadata as it relates to evidence, RIM professionals can start the analysis by considering whether the metadata has been created at the system-level of the network. System metadata has multiple legal purposes. Automatically-generated system metadata can be used in authentication during a trial for both documentary and real electronic evidence. System metadata itself is most likely to be considered real, rather than documentary, evidence.
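To make this concrete for readers less familiar with system metadata, the following minimal Python sketch (illustrative only; the file name and content are hypothetical) shows how an operating system records metadata such as file size and modification time automatically, with no human input beyond the act of saving the file:

```python
# Illustrative sketch: system metadata is captured by the operating
# system itself, without human intervention, whenever a file is written.
import os
import tempfile

# Create a hypothetical file, as a user saving a document would.
with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as f:
    f.write("draft memo")  # the substantive content a human creates
    path = f.name

# The OS has already recorded system metadata about the file.
info = os.stat(path)
print("size (bytes):", info.st_size)          # automatically recorded
print("last modified (epoch):", info.st_mtime)  # automatically recorded

os.remove(path)
```

It is this automatic, system-level capture, independent of any human statement, that underpins the characterization of such metadata as real rather than documentary evidence.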

Substantive metadata can be either documentary or real evidence. Automatically captured edit tracking, for instance, may be real evidence. Substantive metadata may also be evidence of human interaction and is likely to be considered documentary evidence.

Information professionals should be ready to provide information to satisfy the admissibility and weighting rules of documentary evidence, including business records exception to the hearsay rule, authentication through metadata, and proof of integrity of the system through expert testimony or other means.

Embedded metadata has caused some confusion in courts, such as the Animal Welfare rulings. Information professionals should be ready to clearly describe how metadata such as scripts or formulas are executed without human intervention. Otherwise, embedded metadata is likely to be considered documentary evidence, and information professionals should be prepared to satisfy the documentary evidence rules.

RIM professionals interact with, create, and manage metadata in numerous tasks during the course of their work. Although the legal uses of metadata are always evolving as case law advances, RIM professionals can benefit from a solid understanding of the Canadian legal framework and recent case law rulings on the topic. Armed with this knowledge, RIM professionals will be in a better position to make decisions about the automatic generation of metadata, how best to set up metadata profiles, and how to communicate important legal distinctions with IT and others on how metadata is used in Canadian courts today.

 

Publication bibliography 

“ABA-Lawyers Can Search Metadata.” Information Management Journal May/June (2007): 11.
Aguilar v. Immigration and Customs Enforcement Div. of U.S. Dept. of Homeland Sec., 255 F.R.D. 350 (S.D.N.Y. Nov. 21, 2008), November 21, 2008.
Alberta Evidence Act, R.S.A. c. A-18, s. 41 (1980), accessed on 2017-10-31, http://canlii.ca/t/52fmm.
Animal Welfare International Inc. v. W3 International Media Ltd., 2013 BCSC 2193 (CanLII), accessed on 2017-08-30, http://canlii.ca/t/g23fg.
Animal Welfare International Inc. v. W3 International Media Ltd., 2014 BCSC 1839 (CanLII), accessed on 2017-08-30, http://canlii.ca/t/gds4d.
“Backgrounder: The Honourable Justice David M. Paciocco’s Questionnaire”. Department of Justice, Government of Canada. Last modified on April 07, 2017, https://www.canada.ca/en/department-justice/news/2017/04/the_honourable_justicedavidmpacioccosquestionnaire.html.
BL v Saskatchewan (Social Services), 2012 SKCA 38 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/fqtnj.
Bryant, Alan W., Sidney N. Lederman, Michelle K. Fuerst, and John Sopinka. The law of evidence in Canada. Markham, Ont: LexisNexis, 2009.
Canada Evidence Act, RSC 1985, c C-5 (1985), accessed on 2017-10-31, http://canlii.ca/t/52zlc.
Canadian General Standards Board. CAN/CGSB-72.34-2017 Electronic records as documentary evidence. Gatineau, Québec: Canadian General Standards Board, 2017, accessed on 2017-10-31, http://publications.gc.ca/collections/collection_2017/ongc-cgsb/P29-072-034-2017-eng.pdf.
Cook, Michael. The Management of Information from Archives. Aldershot, Hants, England, Brookfield, Vt., USA: Gower, 1986.
Currie, Robert, and Steve Coughlan. “Canada (Chapter 9).” In Electronic Evidence, 3rd edition, edited by Stephen Mason. 283–325. Lexis Nexis, 2012. https://ssrn.com/abstract=2324273.
Delisle, RJ. Evidence: principles and problems. 10th ed. Toronto: Carswell, 2010.
Desgagne v. Yuen et al, 2006 BCSC 955 (CanLII), accessed on 2017-09-01, http://canlii.ca/t/1nnpc.
Duranti, Luciana. “From digital diplomatics to digital records forensics.” Archivaria 68 (Fall 2009): 39–66.
Duranti, Luciana, and Randy Preston. International research on permanent authentic records in electronic systems (InterPARES) 2: Experiential, interactive and dynamic records. CLEUP, 2008.
Duranti, Luciana, and Corinne Rogers. “Trust in Digital Records: An Increasingly Cloudy Legal Area.” Computer Law Security Review 28, no. 5 (2012): 522–31.
Duranti, Luciana, Corinne Rogers, and Anthony Sheppard. “Electronic Records and the Law of Evidence in Canada: The Uniform Electronic Evidence Act Twelve Years Later.” Archivaria 70, (Fall 2010): 95–124.
Duranti, Luciana, and Kenneth Thibodeau. “The concept of record in interactive, experiential and dynamic environments: the view of InterPARES.” Archival Science 6, no. 1 (2006): 13-68.
Evidence Act, RSO 1990, c E.23, accessed on 2017-10-31, http://canlii.ca/t/l1vl.
Evidence Act, SS 2006, c E-11.2, accessed on 2017-10-31, http://canlii.ca/t/52w3h.
Force, Donald C. “From Peruvian Guano to Electronic Records: Canadian E-Discovery and Records Professionals.” Archivaria 69 (2010): 49–75.
Franks, Pat, and Nancy Kunde. “Why METADATA Matters.” Information Management Journal Sept/Oct (2006): 55–61.
Gable, Julie. “Examining metadata: Its role in e-discovery and the future of records managers.” Information Management Journal 43, no. 5 (2009): 28–32.
Gall. GL, FP Eliadis, and F Allard. The Canadian legal system, 5th ed. Scarborough, Ont: Carswell, 2004.
Halsbury’s laws of Canada: Civil Procedure. Markham, Ont.: LexisNexis, 2012.
“Information Governance Maturity Model.” ARMA International, 2013.
Hummingbird v. Mustafa, 2007 CanLII 39610 (ON SC), accessed on 2017-08-27, http://canlii.ca/t/1t0tp.
InterPARES 3 Project. Terminology database, accessed on 2017-08-20, http://www.interpares.org/ip3/ip3_terminology_db.cfm.
Isaza, John. “Metadata In Court: What RIM, Legal and IT Need to Know.” ARMA International Education Foundation, Pittsburgh PA, Nov 2010.
Jenkinson, Hilary. A Manual of Archive Administration. New and Revised. London: Percy Lund, Humphries Co. 1937.
Lemieux, Victoria, Brianna Gormly, and Lyse Rowledge. “Meeting Big Data challenges with visual analytics: The role of records management.” Records Management Journal 24, no. 2 (2014): 122-141.
Mancuso, Lara. “Exploring the Potential of Naming Conventions as Metadata.” Independent study paper, University of British Columbia, 2013.
Manitoba Evidence Act, R.S.M., c. E150 (1987), accessed on 2017-10-31, http://canlii.ca/t/lb8k.
Mason, Stephen. “Authentic Digital Records: Laying the Foundation for Evidence.” Information Management Journal 41, no. 5 (Sept/Oct 2007): 32–40. http://www.arma.org/bookstore/files/Mason.pdf.
Mason, Stephen. “Proof of the Authenticity of a Document in Electronic Format Introduced as Evidence.” ARMA International Educational Foundation, Pittsburgh, PA, October 2006, accessed on 2017-08-25, https://pdfs.semanticscholar.org/8ae3/d41b6c962d141ddb23640418ecaed6a0fa17.pdf.
McGarry v. Co-operators Life Insurance Co., 2011 BCCA 214 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/fl636.
Owens, Scott. “Best Evidence in Canada: An Analysis of Issues and Case Law.” SSHRC project paper, University of British Columbia, 2014.
Paciocco, D. M. “Proof and progress: Coping with the law of evidence in a technological age.” Canadian Journal of Law and Technology 11, no. 2 (2015): 181–228.
Preston, Randy. “InterPARES 2 Chain of Preservation (COP) Model Metadata” (Draft report). (2009).
RS II Productions Inc. v. B.C. Trade Development Corp., 2000 BCCA 674 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/1fnb1.

R. v. Avanes et al., 2015 ONCJ 606 (CanLII), accessed on 2017-09-01, http://canlii.ca/t/gltb6.
R. v. Bernard, 2016 NSSC 358 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/h3w44.
R. v. Clarke, 2016 ONSC 3564 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gsldq.
R. v. C.L., 2017 ONSC 3583 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/h4xxp.
R. v. Hall, 1998 CanLII 3955 (BC SC), accessed on 2017-08-30, http://canlii.ca/t/1f7fx.
R. v. Hamdan, 2017 BCSC 676 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/h3xdd.
R. v. Hirsch, 2017 SKCA 14 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gxq03.
R. v. J.S.M., 2015 NSSC 312 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gmmml.
R. v. K.M., 2016 NWTSC 36 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gs4mv.
R. v. MacDonald, 2016 ABPC 142 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gt5bd.
R. v. Mondor, 2014 ONCJ 135 (CanLII), accessed on 2017-09-01, http://canlii.ca/t/g6986.
R. v. Nardi, 2012 BCPC 318 (CanLII), accessed on 2017-09-01, http://canlii.ca/t/fspw2.
R. v. Nde Soh, 2014 NBQB 20 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/g50jc.
R. v. Dennis James Oland, 2015 NBQB 245 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gp3w3.
R. v. Oler, 2014 ABPC 130 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/g7n43.
R. v. Starr, [2000] 2 SCR 144, 2000 SCC 40 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/525l.
Rogers, Corinne. “Authenticity of Digital Records: A Survey of Professional Practice.” The Canadian Journal of Information and Library Science 39, no. 2 (2015): 97-113.
Rogers, Corinne, ed. Record Authenticity as a Measure of Trust: A View Across Records Professions, Sectors, and Legal Systems. Croatia: Department of Information and Communication Social Sciences, University of Zagreb (2015): 109-118.
Rogers, Corinne. “Virtual authenticity: authenticity of digital records from theory to practice.” PhD dissertation, University of British Columbia, 2015, accessed on 2017-08-20. https://open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0166169.
Rowe, Joy. “‘Are you ready to create digital records that last?’ Preparing users to transfer records to a digital repository for permanent preservation.” Journal of the South African Society of Archivists 49 (2016): 41-56, accessed on 2017-10-31, https://www.ajol.info/index.php/jsasa/article/view/138427.
Rowe, J. “Why Manage when you can Govern? Metadata in the Canadian Legal System.” Poster presented at annual meeting of ARMA Canada, Saskatoon, SK. (June 2013), accessed on 2017-10-31, https://open.library.ubc.ca/cIRcle/collections/graduateresearch/42591/items/1.0075797.
Rules of Civil Procedure, RRO (Ontario), Reg 194 — 1.04(1) (1990), accessed on 2017-10-31, http://canlii.ca/t/52zjn.
Sedona Canada. “The Sedona Canada Principles: 2008 CanLIIDocs 1.” The Sedona Conference Working Group 7 (WG7), Montreal, Jan 2008, fourth update (November 19, 2015), http://commentary.canlii.org/w/canlii/2008CanLIIDocs1en.
The Sedona Conference. “The Sedona Principles, Third Edition: Best Practices, Recommendations and Principles for Addressing Electronic Document Production (2017 Public Comment Version).” March 2017. https://thesedonaconference.org/publication/The%20Sedona%20Principle.
The Sedona Conference Working Group. “The Sedona Principles: Best Practices Recommendations and Principles for Addressing Electronic Document Production.” June 2007.
Sheppard, A. F. “Records and Archives in Court.” Archivaria 19, Winter (1984): 196–203.
Shirley Williams, et al., Plaintiffs, v. Sprint/United Management Company, 464 F. Supp. 2d 1100 (D. Kan. 2006).
Swiss Reinsurance Company v. Camarin Limited, 2015 BCCA 466 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gm28z.
Thacker v Iamaw, District Lodge 140, 2016 BC LA 62600 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/gtsph.
Underwood, Graham and Jonathon Penner, eds. Electronic Evidence in Canada. Toronto: Carswell, 2010.
Winstanley v. Winstanley, 2017 BCCA 265 (CanLII), accessed on 2017-08-27, http://canlii.ca/t/h4v5d.

 

Notes

1An earlier version of this research was published as a poster presentation and benefitted from the feedback of conference attendees: Joy Rowe, “Why Manage when you can Govern? Metadata in the Canadian Legal System.” Poster presentation, ARMA Canada, Saskatoon, 2013. https://open.library.ubc.ca/cIRcle/collections/graduateresearch/42591/items/1.0075797.
2Luciana Duranti and Randy Preston, International research on permanent authentic records in electronic systems (InterPARES) 2: Experiential, interactive and dynamic records (CLEUP, 2008); Randy Preston, “InterPARES 2 Chain of Preservation (COP) Model Metadata” (Draft report). (2009).
3Donald Force, “Peruvian Guano to Electronic Records: Canadian E-Discovery and Records Professionals,” Archivaria 69(2010); Pat Franks and Nancy Kunde, “Why METADATA Matters,” Information Management Journal (Sept/Oct 2006); Julie Gable, “Examining metadata. Its role in e-discovery and the future of records managers,” Information Management Journal 43, no. 5 (2009); John Isaza, Metadata in Court. What RIM, Legal and IT Need to Know (Pittsburg, PA: ARMA International Education Foundation, 2010); Stephen Mason, “Proof of the Authenticity of a Document in Electronic Format Introduced as Evidence” (Pittsburg, PA: ARMA International Education Foundation, 2006); Stephen Mason, “Authentic Digital Records. Laying the Foundation for Evidence,” Information Management Journal 41, no. 5(2007).
4Luciana Duranti and Corinne Rogers. “Trust in digital records: An increasingly cloudy legal area,” Computer Law Security Review 28, no. 5 (2012); Luciana Duranti. “From digital diplomatics to digital records forensics.” Archivaria 68 (Fall 2009).
5As noted in Force, many sources on legal rulings freely reference both US and Canadian rulings (2010, 54). This paper adopts his stricter view of juridical context and strives to cite only Canadian legal sources, when possible.
6There is a technical distinction between the terms “electronic” and “digital.” Courts have largely chosen the terms “electronic record,” “ESI” (electronically stored information), and “electronic/computer evidence.” Other recent literature tends to use “digital”. In the context of this article, I use the terms ‘electronic record’ and ‘digital record’ interchangeably unless there is a reason to give a distinction.
7A. F. Sheppard, “Records and Archives in Court.” Archivaria 19 (Winter 1984).
8Luciana Duranti, Corinne Rogers and Anthony Sheppard, “Electronic Records and the Law of Evidence in Canada: The Uniform Electronic Evidence Act Twelve Years Later.” Archivaria 70 (Fall 2010): 96.
9Duranti, Rogers, and Sheppard, “The Uniform Electronic Evidence Act Twelve Years Later,” 96.
10Robert J. Currie and Steve Coughlan, “Chapter 9, Canada – Electronic Evidence in Canada,” in Electronic Evidence (3rd ed), ed. Stephen Mason (LexisNexis, September 1, 2012), 288.
11Hilary Jenkinson, A manual of archive administration (New and Revised. London: Percy Lund, Humphries Co., 1937): 4.
12Michael Cook, The Management of Information from Archives (Aldershot, Hants, England; Brookfield, Vt., U.S.A: Gower, 1986): 7, 129.
13InterPARES 3 Project Terminology database. http://interpares.org/ip3/ip3_terminology_db.cfm?team=15.
14Currie and Coughlan, “Electronic Evidence in Canada,” 289. Canada Evidence Act, RSC 1985, c C-5, s.31.1 and provincial evidence statutes that have adopted the Uniform Electronic Evidence Act provisions and have similar authentication rules for electronic documents.
15Corinne Rogers, “Record Authenticity as a Measure of Trust: A View across Records Professions, Sectors, and Legal Systems,” INFuture2015: e-Institutions – Openness, Accessibility, and Preservation (The Future of Information Sciences. Croatia: Department of Information and Communication Social Sciences, University of Zagreb, 2015): 111.
16InterPARES 3 Project Terminology database. http://interpares.org/ip3/ip3_terminology_db.cfm?team=15
17An interesting study exploring the Chain of Preservation (COP) metadata schema to identify the basic units of metadata that could be usefully contained in a file naming convention was done by Lara Mancuso, “Exploring the Potential of Naming Conventions as Metadata,” (independent study project paper, University of British Columbia, 2013).
18Rogers, “Record Authenticity”, 112.
19Rogers, “Record Authenticity”, 97.
20ARMA International, “Information Governance Maturity Model,” (2013).
21Examples of use of The Principles® in assessing organizational records maturity include Joy Rowe, “‘Are you ready to create digital records that last?’ Preparing users to transfer records to a digital repository for permanent preservation,” Journal of the South African Society of Archivists 49 (2016): 48; Victoria Lemieux, Brianna Gormly, and Lyse Rowledge, “Meeting Big Data challenges with visual analytics: The role of records management,” Records Management Journal 24, no. 2 (2014): 135-136.
25Pat Franks and Nancy Kunde, “Why METADATA Matters,” Information Management Journal (Sept/Oct 2006): 57.
26Stephen Mason, Proof of the Authenticity of a Document in Electronic Format Introduced as Evidence (Pittsburg, PA: ARMA International Education Foundation, 2006): 6-7.
27Sedona Canada, The Sedona Canada Principles. The Sedona Conference Working Group 7 (WG7), (Montreal, 2008, fourth update 2015): 8.
28Ibid., 8.
30Rules of Civil Procedure may relate to how witnesses are examined, how affidavits may be admitted, or how long before a trial another party must be notified to produce documentary evidence, Halsbury’s laws of Canada: Civil Procedure (Markham, Ont: LexisNexis, 2012): 289.
31“ABA-Lawyers Can Search Metadata,” Information Management Journal (May/June 2007): 11. Shirley Williams, et al., Plaintiffs, v. Sprint/United Management Company, 464 F. Supp. 2d 1100 (D. Kan. 2006). This ruling followed changes in the US Federal Rules of Civil Procedures, which were amended in 2006 to address issues of electronically stored information, or ESI.
32Hummingbird v. Mustafa, 2007 CanLII 39610 (ON SC).
33The Sedona Conference Working Group, THE SEDONA PRINCIPLES: Best Practices Recommendations and Principles for Addressing Electronic Document Production, (2007): 4.
34Ibid., 60.
35These same distinctions are maintained in the March 2017 Public Comment version of the proposed update to the Sedona Principles. The Sedona Conference, The Sedona Principles, Third Edition: Best Practices, Recommendations and Principles for Addressing Electronic Document Production (2017 Public Comment Version).
36John Isaza, Metadata in Court. What RIM, Legal and IT Need to Know (Pittsburg, PA: ARMA International Education Foundation, 2010).
37This three-way distinction of metadata is frequently attributed to US Sedona Principles. In fact, although the Aguilar case did cite Sedona Principles for some of the discussion of metadata, the terms “substantive, embedded, and system” come from US District Court for the District of Maryland, Suggested Protocol Rules for Discovery of Electronically Stored Information, 25-28. This correction of sourcing is important because a key Canadian legal text, Underwood and Penner, takes up the three-way distinction of metadata and this text is cited in several Canadian cases. The Maryland protocol that Aguilar cited has now been replaced by Principles for the Discovery of ESI in Civil Cases, and it uses the two-part distinction found in US Sedona Principles: system metadata and application metadata. Sedona Commentary on Ethics Metadata notes that the court in Aguilar collapsed some of the seven distinct types of metadata defined in the Sedona Glossary into the three categories of substantive, system-based and embedded (2013, p. 6, footnote 7).
38Julie Gable, “Examining metadata. Its role in e-discovery and the future of records managers,” Information Management Journal 43, no. 5 (2009): 30.
39Isaza, Metadata in Court, 8.
404.8% of the 293 respondents had given testimony in a court proceeding, while 10.2% had been involved in a legal hold or e-discovery. Corinne Rogers, “Virtual authenticity: authenticity of digital records from theory to practice” (PhD diss., University of British Columbia, 2015), 115, https://open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0166169.
41Currie and Coughlan, “Electronic Evidence in Canada,” 284.
42GL Gall, FP Eliadis, F Allard, The Canadian legal system, 5th Ed (Scarborough, Ont: Carswell, 2004), 40.
43Halsbury’s laws of Canada: Civil Procedure (Markham, Ont: LexisNexis, 2012), 289.
44Gall, Eliadis, and Allard, The Canadian legal system, 41.
45Halsbury’s laws of Canada: Civil Procedure, 285.
46Ibid., 286-287.
47Halsbury’s laws of Canada: Civil Procedure, 289.
48Currie and Coughlan, “Electronic Evidence in Canada”, 284.
49Ibid., 285. British Columbia, Newfoundland, and Northwest Territories have not yet adopted a form of the coordinated electronic evidence legislation.
50Duranti, Rogers and Sheppard, “The Uniform Electronic Evidence Act Twelve Years Later,” 104-105.
51Currie and Coughlan, “Electronic Evidence in Canada”, 285.
52Underwood and Penner, Electronic Evidence in Canada, 12-1.
53Ibid., 13-7.
54Ibid., 13-1 to 13-2.
55RJ Delisle, Evidence: principles and problems, 10th ed (Toronto: Carswell, 2010), 411.
56Currie and Coughlan, “Electronic Evidence in Canada”, 295.
57Underwood and Penner, Electronic Evidence in Canada, 14-1.
58Saturley v. CIBC World Markets Inc [2012] NSSC 226 at para. 28.
59Currie and Coughlan, “Electronic Evidence in Canada”, 284-285.
60Ibid., 285.
61Underwood and Penner, “Electronic Evidence in Canada”, 11-3, 11-7.
62Canada Evidence Act, RSC 1985, c C-5 (1985), s.31.2 (1).
63Ibid., s.31.3-b.
64Canada Evidence Act, RSC 1985, c C-5, s.31.3-c.
65Ibid., s.31.6(1).
66Ibid., s.31.5.
67Ibid., s.31.2(2).
68Underwood and Penner, “Electronic Evidence in Canada”, 12-1.
69Ibid., 13-8 to 13-9.
70Ibid., 12-1.
71Ibid., 12-13.
72Ibid., 12-2.
73Ibid., 12-5.
74Currie and Coughlan, “Electronic Evidence in Canada”, 291.
75Underwood and Penner, Electronic Evidence in Canada, 12-8.
76Saturley v. CIBC World Markets Inc [2012] NSSC 226.
77Ibid., 16.
78Underwood and Penner, Electronic Evidence in Canada, 12-9.
79Ibid.,12-9.
80Currie and Coughlan, “Electronic Evidence in Canada”, 291. In the Saturley ruling data that was automatically generated by software to register investment trading transactions was determined to be real evidence. It was not a ‘document’ and not therefore subject to the “presumption of reliability designed to satisfy the best evidence rule”.
81Underwood and Penner, Electronic Evidence in Canada, 12-15.
82Ibid., 12-10.
83Ibid., 12-13 to 12-14.
84Ibid., 12-16. This three-way distinction was initially proposed in the US Aguilar ruling, drawing on US Sedona Principles and Maryland Protocol for electronic discovery purposes, and has been extended by Underwood and Penner to apply to ESI for evidence purposes.
85Ibid., 12-17 to 12-18.
86Ibid., 12-19.
87Ibid., 12-19.
88Ibid., 12-20.
89According to the Canlii.org database, 288 cases have mentioned “electronic evidence”, “computer evidence” or “metadata” in the case rulings since 1979. Ninety-one percent of the cases have been within the last ten years (search conducted on August 30th, 2017).
90McGarry v. Co-operators Life Insurance Co, [2011] BCCA 214.
91RS II Productions Inc. v. B.C. Trade Development Corp [2000] BCCA 674 at para. 25 cites numerous rulings and quotes R. v. Starr, [2000] 2 SCR 144, 2000 SCC 40 “the Court has advanced towards what it calls a “principled approach” to the hearsay rule and instructed courts to consider necessity and reliability in each particular case, as opposed to applying “ossified rules” of evidence.”
92McGarry, BCCA 214 at para. 67.
93Canada Evidence Act, R.S.C., c. C-5, s.42 (1985).
94Manitoba Evidence Act, R.S.M., c. E150 (1987); Saskatchewan Evidence Act, R.S.S., c. S-16, ss. 55-56 (1978); Alberta Evidence Act, R.S.A. c. A-18, s. 41 (1980); and Ontario Evidence Act, R.S.O., c. E-23, s. 34.1 (1990).
95McGarry, BCCA 214 at para. 73-77.
96Ibid., at 81.
97Cases citing McGarry: Swiss Reinsurance Company v. Camarin Limited, [2015] BCCA 466 at para. 67; Winstanley v. Winstanley, [2017] BCCA 265 at para. 64.
98Scott Owens’s SSHRC-funded study on the best evidence rule highlighted the rule as an ineffectual mechanism of admissibility in “Best Evidence in Canada: An Analysis of Issues and Case Law” (project paper, University of British Columbia, 2014).
99Currie and Coughlan, “Electronic Evidence in Canada.” 288.
100Ibid., 285.
101BL. v. Saskatchewan (Social Services) [2012] SKCA 38 at para. 41.
102R. v. Hamdan, [2017] BCSC 676 at para. 82.
103Alan W Bryant, Sidney N. Lederman, Michelle K. Fuerst, and John Sopinka, The Law of Evidence in Canada (3d) ed. (Markham: LexisNexis, 2009), 1225.
104David M. Paciocco, recently appointed as a Justice of the Court of Appeal for Ontario, is “considered one of Canada’s foremost experts on the laws of evidence”. The Supreme Court of Canada has referenced work he authored or contributed to in 60 cases. A primary contribution is the promotion of principled and workable legal rules, including the principled exception to the hearsay rule. “Backgrounder: The Honourable Justice David M. Paciocco’s Questionnaire,” Department of Justice, Government of Canada, last modified on April 07, 2017, https://www.canada.ca/en/department-justice/news/2017/04/the_honourable_justicedavidmpacioccosquestionnaire.html.
105D.M. Paciocco, “Proof and progress: Coping with the law of evidence in a technological age,” Canadian Journal of Law and Technology 11 no.2 (2015): 226-227.
106Paciocco, “Proof and progress”, 197.
107R. v. C.L., [2017] ONSC 3583 at para. 21.
108R. v. Hirsch, [2017] SKCA 14 at para. 13.
109R. v. Hirsch, [2017] SKCA 14 at para. 21.
110R. v. Hirsch, [2017] SKCA 14 at para. 18.
11131.5 For the purpose of determining under any rule of law whether an electronic document is admissible, evidence may be presented in respect of any standard, procedure, usage or practice concerning the manner in which electronic documents are to be recorded or stored, having regard to the type of business, enterprise or endeavor that used, recorded or stored the electronic document and the nature and purpose of the electronic document. Canada Evidence Act, R.S.C., c. C-5, s.31.5 (1985).
112R. v. Oler, [2014] ABPC 130.
113The recently revised CAN/CGSB-72.34-2017 Electronic records as documentary evidence is now freely available at http://publications.gc.ca/collections/collection_2017/ongc-cgsb/P29-072-034-2017-eng.pdf.
114R. v. Oler, [2014] ABPC 130 at para. 31.
115Desgagne v. Yuen et al [2006] BCSC 955.
116Desgagne v. Yuen et al [2006] BCSC 955 at para. 29.
117Ibid., at para. 29.
118Ibid., 35, 45.
119Animal Welfare International Inc. v. W3 International Media Ltd., [2013] BCSC 2193.
120Ibid., 53.
121Ibid., 53.
122Animal Welfare International Inc. v. W3 International Media Ltd., [2014] BCSC 1839 at para. 340.
123Ibid., 340.
124Ibid., 346-350.
125Saturley v. CIBC World Markets Inc, [2012] NSSC 226 at para. 22.
126R. v. Bernard, [2016] NSSC 358 at paras. 45-58.
127ARMA International, “Information Governance Maturity Model,” (2013).

Privacy-Driven RIM in Ontario’s Universities

by Carolyn Heald, Queen’s University

 

In a recent issue of this journal, Shan Jin, Records Analyst at Queen’s University in Kingston, Ontario, published her findings on records management practices in Canadian universities based on interviews with twenty-one institutions, the majority of which were located in Ontario. At the time of her initial research in 2014, none of the six Ontario universities with fewer than 10,000 students had a formal records management program; only one of the four universities in the 10-20,000 student category, and three of the five universities in the 20-30,000 student category had a program; all five of the universities with over 30,000 students had a formal records management program1.

“From early days,” Jin wrote, “university archival programs often assumed responsibility for records management2.” However, one key observation from her research is that newer records management programs—those established within the last decade or so—are no longer being situated within archival programs which tend to be considered academic or scholarly units, often part of the library. Rather, they are being placed squarely within the senior administration, typically within a unit dealing with university governance3. Jin pointed out that privacy legislation has had an impact on the establishment of records management programs in Ontario’s universities: four out of the nine programs were launched after universities became subject to the provincial Freedom of Information and Protection of Privacy Act (FIPPA) in June 2006. One of those programs was launched after FIPPA was amended in 2014 to include specific recordkeeping requirements4. Since Jin completed her study, even more universities in Ontario have established records management programs as a direct result of those recordkeeping amendments.

Privacy is a key driver of records management within Ontario’s universities: FIPPA in particular, but also health care privacy legislation, anti-spam legislation and, more recently, the impending impact of international privacy regulations. In my view, pairing records management with privacy is a good move: it brings a broader perspective to the management of active university information and records, offers wider capacity to navigate the challenges universities face in an increasingly digital world, and positions these information professionals to be key players in their universities’ information governance.

 

FIPPA and the Recordkeeping Amendments

Much of what is driving records management in Ontario’s universities is the 2014 amendments to FIPPA—the so-called recordkeeping amendments—which took effect on January 1, 20165. These new clauses were added as a direct result of the gas plants scandal that plagued former premier Dalton McGuinty’s government and which resulted in a criminal conviction for his Chief of Staff for illegally wiping government computers6.

McGuinty had decided to cancel construction of two natural gas power plants, resulting in much higher cancellation fees than he had led the public to believe. The issue was subject to review by a standing committee of the legislature, and in conducting its review, the committee requested emails from political staff in the office of the responsible Minister of Energy. The office did not comply, and it was learned that political staff made a habit of routinely deleting emails. A complaint was made to Ontario’s Information and Privacy Commissioner who launched an investigation where it was learned further that the Secretary of Cabinet had been approached by the Premier’s Chief of Staff seeking information on how to permanently delete emails and other electronic documents.

In her damning 2013 report, former Commissioner Ann Cavoukian remarked that she had “trouble accepting that this practice [i.e., routinely deleting emails] was simply part of a benign attempt to efficiently manage one’s email accounts7.” Rather, she said, “I became very concerned with the apparent lack of responsibility and accountability for records management practices within the offices of senior political leaders in this province.”8 She made several recommendations to the Ministry of Government Services and the Premier’s Office for reviewing records retention schedules, developing policies, and ensuring staff training. She also made some key recommendations for amending FIPPA9.

These amendments are framed in terms of a duty to preserve records in order to ensure a right of access. Records managers often deal with the opposite problem—over-retention—and seek to establish retention schedules so that organizations can get rid of records. Nevertheless, a perceptive records manager will recognize the amendments for what they truly are: a requirement to have in place a well-functioning records management program.

For universities, this is revolutionary. For the first time, Ontario’s universities have a legislated mandate to implement records management. Universities have never been subject to Ontario’s archives and recordkeeping legislation which applies to most other institutions covered by FIPPA (namely government ministries and a number of agencies). Now all FIPPA institutions, including government ministries, agencies, universities, hospitals, as well as municipalities and local municipal bodies covered under the Municipal Freedom of Information and Protection of Privacy Act, are required by law to follow established records retention policies and rules.

This is not to say that universities have not been subject to recordkeeping rules in the past. Like any corporate body, they must abide by laws pertaining to employment, tax, occupational health and safety, and any number of other statutes, some of which incorporate rules on the creation and keeping of records10. But to have a clear directive that universities must have in place recordkeeping measures is truly radical and those without such a program have been stepping up to ensure compliance.

The IPC’s awareness of, and engagement with, records management has been evolving ever since October 2005, when former Commissioner Cavoukian was alerted by a Toronto Star reporter that recycled patient records from a local ultrasound and x-ray clinic were swirling around downtown Toronto streets, being used as a prop for a movie shoot about 9/11. The Commissioner’s investigation and report, her first under the newly implemented Personal Health Information Protection Act, concluded, amongst other things, that recycling personal health information did not constitute secure disposal as required by the Act11. This unfortunate incident first prompted the IPC to connect the dots between information management and privacy, an understanding that has only continued to grow12.

 

Beyond FIPPA

FIPPA has been the key driver towards the increase in the number of records management programs in Ontario’s universities. However, it is not the only piece of privacy legislation that is having an impact on recordkeeping. Many universities are also subject to health care privacy to one degree or another.

Ontario’s Personal Health Information Protection Act (PHIPA) was enacted in 2004 to address the handling of personal health information by health information custodians. Custodians are entities that provide health care services, and in the context of a university, that could be a student health clinic, mental health counselling, athletic injury clinic, or any number of health care services offered to the broader community such as psychological assessments or physiotherapy.

PHIPA’s privacy requirements are much more stringent given the sensitive nature of the information involved, and these records must be managed properly, and retained and disposed of in accordance with appropriate records retention requirements13. Although PHIPA has not been amended to include a specific recordkeeping mandate, the IPC nevertheless has been emphasizing the importance of good records management with respect to patient records14.

Another statute with significant recordkeeping requirements is Canada’s Anti-Spam Legislation, or CASL15. This legislation came into effect on July 1, 2014 with the intention of enhancing trust in Canada’s electronic commerce activities, primarily by reducing the sending of email spam, or commercial electronic messages (CEMs) as defined in the Act. While not a privacy statute per se, CASL is similar to privacy legislation in that it requires an organization or individual to obtain consent—express or implied—before sending a CEM. Individuals can withdraw their consent and accordingly all CEMs must have an unsubscribe mechanism. Penalties for non-compliance can be as high as $10 million, and the federal government has proven that it is not averse to levying significant fines16.

Most universities in Canada consider activities falling within their core educational mandate to be non-commercial, even if money changes hands (such as a student paying tuition). This approach has not been tested, but the government’s investigations to date have focused on private sector bodies that clearly have a commercial basis, rather than public sector bodies. Nevertheless, the legislation includes a private right of action allowing individuals to sue senders of CEMs for alleged CASL violations. This private right of action was supposed to have taken effect on July 1, 2017, but due to numerous concerns expressed about its implications, it has been postponed indefinitely pending further study17.

CASL brings with it the requirement to create and keep detailed records, primarily for the purpose of tracking consents so that senders of CEMs can defend themselves in the event they are challenged with non-compliance. In 2016, the government issued an enforcement advisory notice on how to keep records of consent and the types of records expected to be kept, including “all evidence of express and implied consent (e.g. audio recordings, copies of signed consent forms, completed electronic forms) from consumers who agree to receive CEMs, documented methods through which consent was collected, policies and procedures regarding CASL compliance, and all unsubscribe requests and resulting actions18.” The House of Commons Standing Committee on Industry, Science and Technology undertook a statutory review of CASL in 2017 and made a number of recommendations seeking clarification of many of its provisions. Ontario’s universities are amongst numerous organizations looking forward to the outcome.

Looming on the horizon is the European Union’s General Data Protection Regulation (GDPR), which will take effect on May 25, 201819. GDPR will replace the EU’s 1995 Data Protection Directive, which has become out of date given the rapid technological changes experienced over the past twenty years. The GDPR applies to EU residents and significantly expands their personal privacy rights. Not every university in Ontario will be affected by the GDPR to the same extent, but likely most universities will have to grapple with its implications since it applies to entities that (a) have operations or employees in the EU, (b) offer goods or services to EU residents, or (c) monitor behaviour of EU residents, such as through online tracking. As with CASL, managing consents will be paramount because penalties for non-compliance can be as much as €20 million or 4% of global turnover. As with other privacy legislation, the proper management of personal information will be critical in order to ensure it is collected, used, disclosed, retained and disposed of in compliance with the legislation.

The privacy environment in which universities operate is becoming increasingly complex, prompting a concomitant records management imperative. Privacy depends on good information-handling, a fact being articulated with conviction by more and more privacy regulators.

 

Privacy and Records Management Challenges in Universities

Universities are multifaceted and unwieldy organizations. Each university is an autonomous body with its own culture but all universities have in common a few key factors: they have a diverse and largely uncontrolled clientele and workforce; they foster innovation, risk-taking, and pushing boundaries; and they are open and collaborative by design. The interplay of these three factors, especially within an increasingly digital world, makes establishing rules and limits around the handling of records and information a challenging task.

One indicator of this wild frontier is the range of devices that connect to a university’s IT infrastructure. While many organizations are only now developing their BYOD (bring your own device) policies, universities have been dealing with a BYOD environment for years. Far from studying the issue first, assessing the risks and developing policies and procedures to ensure a smooth implementation, universities have been running to keep up with the new devices that arrive on their campuses every term. Students are not restricted in what type of laptop, tablet, cellphone, or indeed any device, they bring to class, and university IT departments have typically focused on modifying and upgrading their infrastructure to support this constantly changing environment. Faculty members, too, need to access the Internet and their own electronic files from many locations, on campus and off, frequently in far-flung locales, and they, too, want to use devices of their own choosing.

Not all devices are secure, and it can be difficult conveying the message that devices should be encrypted at the very least, and not used for sensitive personal information. When it comes to records management, it can be a struggle to ensure that university records are filed in a proper recordkeeping system, and that devices are wiped clean of official records when they are no longer being used by employees. If a device is stolen or lost, individuals are usually more concerned with the cost of replacement and the loss of their intellectual property rather than having caused a privacy breach or losing official records.

When it comes to innovation, universities are expected to challenge the status quo, explore new ideas, interrogate conventional wisdom, and tackle unpopular ideas. In many workplaces, the Internet can be locked down to prevent access to undesirable websites (or surfing on staff time), but no such restrictions are imposed or tolerated in a university environment. A researcher may be studying the pornography industry, or religious intolerance in the Middle East, or fake news in the United States, or any number of controversial topics. Unrestricted access to the Internet and other resources is required. New thinking and risk-taking are the order of the day, bringing the potential for connecting to unvetted and insecure external resources.

In a similar vein, universities pride themselves on being collegial, collaborative and open. Faculty members and often students as well need to engage with the national and international community of scholars, and this can happen through technology or through in-person engagements in a multitude of uncontrolled environments. The need to share documents and resources often results in the use of cloud-based tools that may or may not be properly vetted for privacy and security.

Indeed the rapid adoption of cloud computing tools has been occurring in Ontario’s universities for several years. Not only are such tools essential for research collaboration, but they are also being used for a myriad of classroom purposes to engage students and allow greater collaboration. From large-scale learning management systems to classroom clickers, online textbooks to exam-writing websites, students are constantly being directed to third-party sites and tools. Technology-enhanced learning will only continue as we expect to see the curriculum enabled through artificial intelligence, the Internet of Things, social media, gamification and virtual reality20.

But cloud tools are not just invading the classroom; they are being used for numerous other kinds of student services as well as administrative functions. There are apps and websites for pre-ordering meals in the cafeteria, reserving tickets to campus events, registering children in summer camps, fostering alumni engagement, managing student conduct, and on and on. There is simply no end to the ingenuity of software and app developers.

The upshot of all of this cloud engagement is that universities have placed their trust in commercial third parties to handle significant amounts of personal information and other confidential and sensitive data, and they continue to do so as the cloud computing environment evolves and transforms around them.

FIPPA requires universities to ensure that personal information is not collected, used, disclosed, retained or disposed of improperly and so it is imperative that universities review cloud engagements and negotiate appropriate terms before such tools are used.

In my opinion, the most baleful effect of cloud tools is only just beginning to be recognized. Within the last few years, the business model for cloud service providers has shifted: where in the past, vendors made money from selling their products, now they seek to monetize the personal information they collect from users. It is a common refrain that when it comes to cloud, you are the product, meaning that what companies really want is their users’ personal information so that it can be shared with advertising networks where companies pay to place their ads that will pop up on users’ browsers or in their social network feeds21.

Gathering people’s personal information relies on tracking users as they browse the Internet and set up accounts on various digital platforms. The companies providing the services and apps are not doing anything illegal since they describe their information practices in their terms of service and privacy policies. But such policies are lengthy and convoluted, and vendors rely on users to consent to them without reading them22. Vendors also engage in definitional gymnastics to confuse those few users who bother to read the terms:

Terms of service or end-user licensing agreements are designed to be convoluted legal documents many pages long to discourage user understanding. The interconnected nature of technology means that many devices, services, or websites transfer data with third-party companies, all of which have their own user agreements23.

Companies will claim that they are not collecting “personal information” because the data they track is not name-identified; however, collecting information about devices and browsing history in order to track people and send them targeted advertisements based on predictive analytics clearly demonstrates that companies can identify individual users, even if they do not have a name. Privacy regulators believe such data collection constitutes the collection of personal information. The federal privacy commissioner has expressed this view in a policy position on online behavioural advertising (OBA):

Taking a broad, contextual view of the definition of personal information, the OPC [Office of the Privacy Commissioner] will generally consider information collected for the purpose of OBA to be personal information, given: the fact that the purpose behind collecting information is to create profiles of individuals that in turn permit the serving of targeted ads; the powerful means available for gathering and analyzing disparate bits of data and the serious possibility of identifying affected individuals; and the potentially highly personalized nature of the resulting advertising24.

The UK’s Information Commissioner has also provided guidance on the new General Data Protection Regulation and points out that the definition of “personal data” has expanded:

The GDPR applies to ‘personal data’ meaning any information relating to an identifiable person who can be directly or indirectly identified in particular by reference to an identifier.

This definition provides for a wide range of personal identifiers to constitute personal data, including name, identification number, location data or online identifier, reflecting changes in technology and the way organisations collect information about people25.

The privacy invasiveness and power imbalance inherent in this “surveillance capitalism”26 is concerning at a societal level; however, at the more mundane level of the university privacy officer, such activity constitutes secondary use under FIPPA which requires consent. A university could simply inform its users that their personal information is being collected by the third party for this secondary purpose and ask them to agree if they wish to set up an account or download an app. However, if use of the service or app is mandatory, then consent is not appropriate and the only recourse, apart from not engaging with the vendor, is to negotiate an agreement to ensure there is no secondary use of personal information.

The privacy challenges of using cloud tools in the classroom have been gaining attention as school boards and educational authorities recognize the dangers inherent in tracking. Ontario’s Information and Privacy Commissioner has issued guidance to school boards at the K-12 level27. Similar guidance for the US higher education sector has been issued by EDUCAUSE, an association for IT professionals within the post-secondary sector28.

Cloud computing also brings with it traditional records management challenges. When engaging a cloud vendor, frequently there is little if any consideration given to appropriate recordkeeping practices. Is the vendor expected to retain the university’s official record on its servers? Will the vendor assist the university in implementing the authorized records retention schedules? Can the university get a copy of the record in a readable format when the engagement is terminated? Will the vendor permanently delete its copy of the record and any backups? These are issues that are often neglected when negotiating a contract with a vendor, yet it is important to consider them because at some point, the engagement will end.

Most Ontario universities have outsourced their email and office environment to either Microsoft or Google. Initially, it was student email that was outsourced, but it did not take long for university administrations to decide that the whole office environment could be handled by third-party service providers more efficiently. This decision was made, for the most part, without thought to records management. These tools may have abundant functionality for creating, sharing, and collaborating on documents, but they have little records management functionality as the records management profession would understand it29.

 

Opportunities for Information Governance

The privacy and records management challenges facing universities today highlight the value of pairing records management and privacy closely. Records management and privacy are complementary functions, each informing the other. Where a records management unit is situated is not important in itself; the success of a program depends more on the individuals involved—their grasp of the challenges, their ability to communicate those challenges and their ability to cooperate with others—than on where the program resides within the organizational hierarchy. Jin is absolutely correct in saying that “records managers must capitalize on the advantages and overcome the disadvantages of [a university’s] organizational structure in order to seek ways to improve records management services. It is important to align efforts from the records management program with other strategic partners such as archives, Information Technology (IT) security, legal department, privacy and compliance office, etc30.”

And yet, being co-located with the privacy function may open up more opportunities for records managers to participate at the information governance (IG) table, in alliance with other, better-understood functions, such as IT security. The inclusion of privacy professionals in the IG conversation is a given due to legislation and the well-understood risks inherent in a privacy breach, but the records manager is often forgotten because the negative consequences of financial and operational inefficiencies resulting from unmanaged information are not obvious until a crisis occurs. However, the records management perspective is broader than privacy, and more complementary to the IT security professional’s perspective than one might imagine.

The IT security professional focuses on the CIA triad: confidentiality, integrity, and availability. The privacy professional’s domain is confidentiality of personal information and availability of records for the purposes of fulfilling an access to information request. The records manager’s focus encompasses a wider understanding of availability and confidentiality, and significantly adds integrity31. For the records manager, integrity is one of the characteristics of an authoritative record as defined in the ISO Records Management standard32. The integrity of an electronic records system is the Canadian proof standard for submitting an electronic record as legal evidence in a court of law33. The records manager can speak to CIA more broadly than the privacy professional.

The pairing of records management with privacy may be a fairly recent phenomenon for Ontario’s universities; however, Christine Ardern rightly points out that privacy and records management have both been incorporated within the broader information governance field at least since the advent of electronic records, whose volume of personal information and ease of access highlighted the potential privacy risk. This realization led to the publication in 1980 of the OECD Guidelines on the Protection of Privacy and Trans-border Flows of Personal Data and drove many countries to develop their own data protection legislation34. In her article on the evolution of IG, Ardern cites a number of definitions of information governance, but I like the definition from the Sedona Conference Journal:

(Information Governance) means an organization’s coordinated, inter-disciplinary approach to satisfying information compliance requirements and managing information risks while optimizing information value35.

The focus on a coordinated and interdisciplinary approach calls out the reality that managing an organization’s information assets is not the responsibility of one kind of information professional with one perspective. FIPPA’s recordkeeping requirements have been the catalyst for many of Ontario’s universities to adopt a records management/privacy nexus. Perhaps the momentum can continue to drive these information professionals, in cooperation with their archives and IT colleagues, to adopt a more encompassing and interdisciplinary information governance framework.

 

 

1Shan Jin, “Records Management in Canadian Universities: The Present and the Future,” Sagesse (Winter 2017), p. 2, http://armacanada.org/images/Sagesse_Winter_2017/3._RM_in_Canadian_Universities-The_Present      Future.pdf (accessed 27 June 2017). Jin’s study of 21 universities represents approximately 20% of universities in Canada.
2Ibid., p. 1.
3Ibid., p. 4.
4Ibid., pp. 10-11.
5Information and Privacy Commissioner of Ontario, FIPPA and MFIPPA: Bill 8 – The Recordkeeping Amendments (December 2015).
6Karen Howlett, “McGuinty’s former aide Livingston found guilty of destroying documents about gas plant cancellations,” Globe and Mail (19 January 2018), https://www.theglobeandmail.com/news/national/mcguinty-aide-found-guilty-of-destroying-documents-in-gas-plant-trial/article37669957/ (accessed 20 January 2018).
7Information and Privacy Commissioner of Ontario, Deleting Accountability: Records Management Practices of Political Staff (June 5, 2013), p. 2.
8Ibid., p. 1. Numerous media stories have been published about the court case against David Livingston, the Premier’s Chief of Staff, and the deputy Chief of Staff, Laura Miller. For a story with a highly salient records management focus, see Karen Howlett, “Top McGuinty aide wasn’t trained in record keeping, court hears,” Globe and Mail (24 October 2017), https://www.theglobeandmail.com/news/national/top-mcguinty-aide-wasnt-trained-in-record-keeping-court-hears/article36710297/ (accessed 3 January 2018).
9IPC, Deleting Accountability, pp. 2-3. Note that the recommendation to create a duty to document key decisions was not included in the legislative amendments.
10See, for example, the Employment Standards Act, the Occupational Health and Safety Act, and the federal Income Tax Act. Even FIPPA itself has a requirement to retain personal information for a minimum of one year.
11Information and Privacy Commissioner of Ontario, PHIPA Order HO-001 (October 2005).
12IPC staff regularly make presentations about privacy and records and information management; see presentations on www.ipc.on.ca. As well, the IPC issued a guidance document entitled Improving Access and Privacy with Records and Information Management (November 2016).
13In Ontario, patient information is generally retained for ten years, or ten years after the individual reaches his or her eighteenth birthday. This standard retention period is reflected in the various acts governing the regulated health professionals.
14See the IPC website (www.ipc.on.ca) for various presentations and guidance documents.
15The legislation has become known as CASL because its full title is An Act to promote the efficiency and adaptability of the Canadian economy by regulating certain activities that discourage reliance on electronic means of carrying out commercial activities, and to amend the Canadian Radio-television and Telecommunications Commission Act, the Competition Act, the Personal Information Protection and Electronic Documents Act and the Telecommunications Act. In December 2017, the House of Commons Standing Committee on Industry, Science and Technology published its report concluding its statutory review of the legislation. The first of the report’s 13 recommendations is to adopt a short title for the Act, the suggestion being the Electronic Commerce Protection Act (ECPA). See Canada, House of Commons, Canada’s Anti-Spam Legislation: Clarifications Are in Order, Report of the Standing Committee on Industry, Science and Technology (December 2017).
16Findings have included: Porter Airlines, $150,000; Rogers Media, $200,000; PlentyofFish, $48,000. See https://crtc.gc.ca/eng/com500/ut2015.htm (accessed 3 January 2018). A 2015 decision against CompuFinder with a penalty of $1.1 million was later reduced to $200,000; see http://www.slaw.ca/2017/11/15/crtc-compufinder-decision-lowers-casl-spam-penalty/ (accessed 3 January 2018).
17See the Government’s announcement: https://www.canada.ca/en/innovation-science-economic-development/news/2017/06/government_of_canadasuspendslawsuitprovisioninanti-spamlegislati.html (accessed 3 January 2018).
18The enforcement advisory was issued on July 27, 2016: https://www.canada.ca/en/radio-television-telecommunications/news/2016/07/enforcement-advisory-notice-for-businesses-and-individuals-on-how-to-keep-records-of-consent.html (accessed 3 January 2018).
19See https://www.eugdpr.org/ (accessed 3 January 2018).
20There are numerous articles about disruptive technologies in higher education. See, for example, David Wheeler, “Technology and the Imminent Disruption of Higher Education: Is Fear the Path to the Dark Side?” Academica Forum (13 January 2016), https://forum.academica.ca/forum/technology-and-the-imminent-disruption-of-higher-education-is-fear-the-path-to-the-dark-side (accessed 4 January 2018). See also Leigh M. and Thomas Goldrick, “The top 5 disruptive technologies in higher ed,” eCampus News (June 5, 2017), https://www.ecampusnews.com/disruptions-and-innovations/disruptive-technologies-higher-ed/
21There is no shortage of articles available on the Internet explaining how ad networks work to those who wish to take advantage of them. Two interesting books with a decidedly cautionary perspective are Tim Wu, The Attention Merchants: The Epic Scramble to Get Inside Our Heads (New York: Alfred A. Knopf, 2016) and Bob Hoffman, BadMen: How Advertising Went from a Minor Annoyance to a Major Menace (Bob Hoffman, 2017).
22Alex Hern, “I read all the small print on the Internet and it made me want to die,” The Guardian (15 June 2015), https://www.theguardian.com/technology/2015/jun/15/i-read-all-the-small-print-on-the-internet (accessed 26 January 2018).
23“Don’t Buy What Surveillance Capitalism is Selling,” Smith Business Insight (15 November 2017), https://smith.queensu.ca/insight/articles/don_t_buy_what_surveillance_capitalism_is_selling (accessed 26
24Office of the Privacy Commissioner, “Policy Position on Online Behavioural Advertising” (December 2015), https://www.priv.gc.ca/en/privacy-topics/advertising-and-marketing/behaviouraltargeted-advertising/bg_ba_1206/ (accessed 4 January 2018).
25United Kingdom, Information Commissioner’s Office, https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/key-definitions/ (accessed 4 January 2018).
26The term was coined by Shoshana Zuboff; see https://en.wikipedia.org/wiki/Surveillance_capitalism (accessed 26 January 2018).
27See Information and Privacy Commissioner, Online Educational Services: What Educators Need to Know (November 2016) https://www.ipc.on.ca/wp-content/uploads/2016/11/online-educational-services.pdf (accessed 26 January 2018).
28See information on learning data privacy principles on the EDUCAUSE website: https://library.educause.edu/resources/2016/12/learning-data-privacy-principles-and-recommended-practices- webinar (accessed 26 January 2018).
29For commentary on the records management challenges inherent in the Office 365 environment, see “How Office 365 challenges traditional records management practices” (27 September 2016), https://andrewwarland.wordpress.com/2016/09/27/how-office-365-challenges-traditional-records-management-practices/ (accessed 22 September 2017) and “Top 5 Office 365 Problems for Information Managers” (11 September 2017), http://cannonspark.ca/blog.html (accessed 22 September 2017).
30Jin, p. 5.
31For an interesting look at the complementarity perspectives of privacy and information security see Ken Mortensen, “CPO to CISO: Four Steps for Privacy Professionals to Get Security Savvy,” CPO Magazine (30 October 2017), https://www.cpomagazine.com/2017/10/30/cpo-ciso-four-steps-privacy-professionals-get-security-savvy/ (accessed 20 December 2017). The article does not speak to records management.
32International Organization for Standardization, ISO 15489-1:2016, Information and documentation – Records management – Part 1: Concepts and principles (ISO, 2016).
33Canadian General Standards Board, CAN/CGSB-72.34-2017, Electronic Records as Documentary Evidence (CGSB, March 2017).
34Christine Ardern, “From Records Management to Information Governance: A Look Back at The Evolution,” Sagesse (Spring, 2016), p. 12, https://www.armacanada.org/index.php/canadian-rim/spring-2016-publication (accessed 27 June 2017).
35Ibid., p. 4

2017 Edition

Welcome to the Winter 2017 Sagesse Publication!

1. Introduction – Sagesse Winter 2017 (384 KB)

by Uta Fox, CRM, ARMA Canada, Director of Canadian Content

2. Memory as a Records Management System (471 KB)

by Sandra Dunkin, MLIS, CRM, IGP and Cheri Rauser, MLIS

3. RM in Canadian Universities – The Present & Future (329 KB)

by Shan Jin, MLIS, CRM, CIP

4. Electronic Recordkeeping – From Promise to Fulfillment (370 KB)

by Bruce Miller, IGP, MBA

5. D’archivage électronique – De la promesse à l’accomplissement (439 KB)

par Bruce Miller, IGP, MBA

6. From Chaos to Order – A Case Study In Restructuring a Shared Drive (957 KB)

by Anne Rathbone, CRM and Kris Boutilier

Meet the Authors

Sandra Dunkin, MLIS, IGP, CRM, is the Records & Information Management Coordinator for the First Nations Summit Society. Sandra is also currently the Program Director for the ARMA Canada Conference, Chair of ARMA Vancouver’s First Nations RIM Symposium Committee, and a member of ARMA International’s Core Competencies Update Group.

Currently working as an academic librarian in online distance education, Cheri Rauser, MLIS, enjoyed exploring the current state of cognitive informatics and neuroscience while collaborating on this article. Besides academic librarianship, Cheri has worked as a university lecturer, museum cataloguer and moving image archivist. Her other research interests include the role of the library in accreditation and web-based versus paper indexing.

Shan Jin, MLIS, CRM, CIP, is a Records Analyst/Archivist at Queen’s University Archives. She earned a master’s degree in library and information studies from Dalhousie University and is a Certified Records Manager and a Certified Information Professional. She has contributed to several ARMA technical reports. Shan can be contacted at jins@queensu.ca.

Bruce Miller, IGP, MBA, is President of RIMtech, a vendor-neutral records technology consulting firm. He is an author, an educator, and the inventor of modern electronic recordkeeping software. The author of “Managing Records in Microsoft SharePoint”, he specializes in the deployment of Electronic Document and Records Management Systems.

With a twenty-two-year career in local government IT, Kris Boutilier has overseen numerous reinventions of technology: from DOS 3.2 to Windows 10, from 300 baud dial-up to 100 Mbit broadband, from the IBM Selectric to Office 365. Transitioning from physical to electronic records management has proved the most challenging undertaking yet. Contact Kris at Kris.Boutilier@scrd.ca.

Anne Rathbone, CRM, has 20 years of RIM experience, all with local governments. She was one of the leaders of the shared drive project, provides all staff training on the new shared drive, and is responsible for maintaining the integrity of the new framework. She echoes Kris’ sentiments about e-records. Contact Anne at Anne.Rathbone@scrd.ca.

Sagesse: Journal of Canadian Records and Information Management an ARMA Canada Publication Winter, 2017 Volume II, Issue I

 

Introduction

 Welcome to our second issue of Sagesse: Journal of Canadian Records and Information Management and an ARMA Canada publication!

In March 2016, ARMA Canada launched the first issue of this publication under the working title Canadian RIM, an ARMA Canada Publication. At the same time, we invited the ARMA Canada membership to suggest a title that reflected Canadian records and information management and information governance. Deidre Brocklehurst, of Surrey, British Columbia, proposed Sagesse: Journal of Canadian Records and Information Management, which was the top choice of the Canadian Content Committee, now called Sagesse’s Editorial Review Committee (the committee). We congratulate Deidre on such an appropriate title.

Sagesse (pronounced “sa-jess”) is a French word meaning wisdom, good sense, foresight and sagacity, which is most appropriate to the mandate of ARMA Canada’s publication. It embodies Canada’s unique heritage while evoking knowledge, wisdom and common sense.

Sagesse’s Issue I in 2017 features the following articles:

 

  • “Memory as a Records Management System,” by Sandra Dunkin and Cheri Rauser, highlights how our brains process, organize and retrieve information through memory and recall patterns, and frames those processes as a records management system. This is indeed a unique approach to records management.
  • Shan Jin presents an interesting and thorough discussion on records management in Canadian universities in her article entitled, “Records Management in Canadian Universities: the Present and the Future.” She provides a comprehensive view of current records management practices in Canadian universities.
  • In “From Promise to Fulfillment: The 30-Year Journey of Electronic Recordkeeping Technology,” Bruce Miller shares his intriguing personal journey in the development of electronic recordkeeping software technology. This article is also translated into French.
  • And, Anne Rathbone and Kris Boutilier provide a case study on the Sunshine Coast Regional District in British Columbia and its ambitious undertaking of restructuring a shared drive used by all employees, in their article, “From Chaos to Order – A Case Study in Restructuring a Shared Drive.”

 

We’d also like to draw your attention to the disclaimer at the end of this Introduction, which notes that the opinions expressed by the authors are not those of ARMA Canada or the committee. If, after reading any of the papers, you find yourself in agreement or have other thoughts about the content, we would certainly like to hear them and urge you to share them with us. We’ll try to publish your reflections in our next issue. And if you have any recommendations about our publication, please share those as well. Opinions and comments should be forwarded to: armacanadacancondirector@gmail.com.

What goes into putting this type of publication together? First of all, we need you and your RIM-IG experiences! Then, Sagesse’s volunteer Editorial Review committee is on hand to assist you. We have received some amazingly unique articles for publication and we applaud our authors for their dedication to our Canadian industry.

It takes time to prepare each edition, from the point when we approach authors to when we actually are able to publish an article. Information about the types of articles we are interested in and the process through which the articles go is available on ARMA Canada’s website – www.armacanada.org – see Sagesse.

One other item I would like to draw your attention to: we are shamelessly promoting a session at ARMA Canada’s upcoming conference in Toronto, ON, in 2017. Two of Sagesse’s Editorial Review committee members, Christine Ardern and John Bolton, will deliver a presentation on writing for Sagesse – something we encourage each of you to pursue.

 

Enjoy!

ARMA Canada’s Sagesse’s Editorial Review Committee:

Christine Ardern, CRM, FAI
John Bolton

Alexandra (Sandie) Bradley, CRM, FAI

Uta Fox, CRM, Director of Canadian Content
Stuart Rennie

 

DISCLAIMER

The contents of material published on the ARMA Canada website are for general information purposes only and are not intended to provide legal advice or opinion of any kind. The contents of this publication should not be relied upon. The contents of this publication should not be seen as a substitute for obtaining competent legal counsel or advice or other professional advice. If legal advice or counsel or other professional advice is required, the services of a competent professional person should be sought.

While ARMA Canada has made reasonable efforts to ensure that the contents of this publication are accurate, ARMA Canada does not warrant or guarantee the accuracy, currency or completeness of the contents of this publication. Opinions of authors of material published on the ARMA Canada website are not an endorsement by ARMA Canada or ARMA International and do not necessarily reflect the opinion or policy of ARMA Canada or ARMA International.

ARMA Canada expressly disclaims all representations, warranties, conditions and endorsements. In no event shall ARMA Canada, its directors, agents, consultants or employees be liable for any loss, damages or costs whatsoever, including (without limiting the generality of the foregoing) any direct, indirect, punitive, special, exemplary or consequential damages arising from, or in connection to, any use of any of the contents of this publication.

Material published on the ARMA Canada website may contain links to other websites. These links to other websites are not under the control of ARMA Canada and are merely provided solely for the convenience of users. ARMA Canada assumes no responsibility or guarantee for the accuracy or legality of material published on these other websites. ARMA Canada does not endorse these other websites or the material published there.

Authors’ Forward to Memory as a Records Management System

 

This paper was originally written in 2000 as a required assignment for the University of British Columbia’s (UBC) School of Library, Archival and Information Studies’ (SLAIS) LIBR 516: Records Management course, taught by Alexandra (Sandie) Bradley. The paper was subsequently published in the ARMA Vancouver Chapter newsletter, VanARMA, Volume 35, Issue 7, February 2004 (see newsletter introduction below).

The authors thoroughly enjoyed the initial collaborative and creative process when first drafting this paper as students and were thrilled to be asked to revisit the topic for Sagesse. The original paper now seems charmingly naïve in parts, and certainly dated, as we reference ‘palm pilots’ and other outdated technology concepts and limitations of the year 2000. And yet much of the substance of our original thesis remains compelling after the intervening 16 years of development.

In the years since the paper was first written and published, the authors have been influenced by the many changes occurring in the fields of library science and records management, neuroscience, and technology research and development; by the documented challenges to cultural biases around collective memory; and by the growth in our own understanding stemming from personal experience and increased professional knowledge as our careers progress. The authors have endeavoured, in this iteration, to update our understanding of these and other topics through the exploration of new research into neuroscience and jurisprudence, as well as through an expanded discussion of collective memory and oral traditions as a valid means of historical and cultural record keeping, with a particular focus on Canadian Aboriginal oral culture, drawing on one of the authors’ professional employment experience.

The original paper remains almost in its entirety (with quotes referencing palm pilots and all); however, it has been gently restructured to fit within the context of our new research and expanded thesis development. The original introduction to the paper from the VanARMA publication has also been retained below, as it is no longer available in print or online. It has been a joy to revisit this topic and review new research to expand our own understanding of this complex subject.

Sandra Dunkin & Cheri Rauser, Vancouver, BC September 2016

VanARMA Introduction to Memory
as a Records Management System, February 2004:

By: Sandra Dunkin, VanARMA Newsletter Editor

This month features an article on how the human brain processes, organizes, and retrieves information through memory and recall patterns. It outlines many of the basic memory functions the human brain is capable of and compares them, more than favourably, with modern technological devices designed to recreate those very processes artificially.

Many of us spend countless hours in front of our computers, structuring databases, creating classification and retrieval systems for records of all types on a variety of media. What if we could improve upon the document management systems we use daily by having them mimic our natural cognitive processes?

As technology speeds ahead with new, bigger and faster means of storage and retrieval of records, we have a unique opportunity, Muse-like, to inspire software designers, hardware engineers and other assorted computer geeks on how to create better utilities to manage these masses of information. We already have voice recognition capability, but what about making access points to data storage more flexible, more “human”?

The human brain has a long history of storage and retrieval processes with a plethora of access points that can be as humorous and surprising as they are effective. Databases and other software applications are the tools of our profession. Perhaps it is time to consider how we really want/need them to work for us.

Mnemosyne, one of the Titans of Greek mythology, Goddess of Memory and, by Zeus,
mother of the Muses. According to Mary Carruthers (1996),
memory was the most noble aspect of ancient and medieval rhetoric.
Oil painting by Dante Gabriel Rossetti, 1881. Collection of the Delaware Art Museum,
Wilmington. Gift of Samuel and Mary R. Bancroft.

 

Memory as a Records Management System

by Sandra Dunkin, MLIS, CRM, IGP & Cheri Rauser, MLIS

 

Introduction

In the last 15-20 years there has been an exponential increase in the use of mobile technology, even as some twenty-first-century Luddites bemoan our embrace of it. Those of us who appreciate its conveniences and rely on it to do our jobs want to understand how we can develop technology that draws on the human capacity to store and retrieve information, using the human brain as a template for future records management systems.

Rather than sound a warning bell of dire consequences if we don’t halt our engagement with mobile technology, the authors’ intention via this work is to highlight and validate the intrinsic value in human brain based oral memory creation and its application in records and information management (RIM). While acknowledging the inherent danger of information overload, the authors will explore the potential of orality in records management endeavours: past, present and future. Further, to highlight the value and the connections between what humans have always done and are working to improve, while still utilizing modern technology, the authors will explore some methods in which oral traditional cultures encode records in memory and discuss some of the ancient functions of oral records managers.

 

The Brain and Records Management

An early response to the increase in mobile technology came from Kate Cambor (1999), who suggested that people were becoming overly reliant on an “accumulating external memory network” of aides-memoire in the form of computers, palm pilots and other storage and retrieval devices (2). Cambor maintained that increasing dependence on such externalised media had been to the detriment of training our neurological filing cabinets (aka memory) to perform their autonomic tasks of classifying and retrieving data. Jim Connelly, CRM (1995) had made similar warnings four years earlier, suggesting that the brain’s capacity for retrieving information is a “common” records management tool that is being largely forgotten and ignored (35).

The authors maintain that records managers and information professionals are in the position of assisting us in managing the information overload that can result from access to far greater amounts of information than ever before. The professional skills and techniques of this group can be harnessed to help mitigate that potential overload through accessing the human brain’s innate capacity to make sense of what appears at first to be nonsensical and disconnected. The authors will show how the technology of the human brain and the methods that humans have employed to develop oral systems to store information (corporate memory) can provide clues for modern records managers to design systems that enhance, rather than work against, the human brain’s capacity for logical storage and retrieval of information (Wang, 2003).

Records and information management (RIM) professionals could, in the near future, plan information systems that take into account the logical mental cues that enhance people’s memories and therefore their ability to retrieve information from both mental storage facilities and computerised storage systems. According to Jim Connelly (1995), information and records managers need to recognise that “memory is . . . the most common information retrieval software known to man [sic]” (35). Understanding how the human brain stores information could enhance our ability to anticipate how the brain strategically files or searches for data within a central records management system, in a library database, an online catalogue, in an encyclopaedia or on the internet.

But, in order to employ the logic of human-filed memory, we must first understand how memory is encoded in the brain and the roles that oral memory systems have traditionally played in how individuals and, therefore, societies remember. To that end, the authors will review the concept of oral memory and oral traditions, as well as the science of the brain’s memory functionality. This context is essential in formulating theories for the improvement of modern records management systems – storage, maintenance, retrieval and disposition. The tenacity and perseverance of oral records should inform the paradigm by which we approach modern RIM practice, especially in this age of Big Data and the proliferation of stored records.

 

Oral Traditions and the Written Record

Circa 2,500 years ago, in the Phaedrus, Plato asserted that Socrates saw the development of alphabets as a crutch that limited the capacity and usefulness of the brain as the central storage system for human knowledge. “Writing, far from assisting memory, implanted forgetfulness into our souls” (Plato 370 BC, 274c-276e, Translation by Fowler 1925, and Kelber 1995, 414). Using the written word during this time, in the context of aeons of aurally transmitted records management culture, was rather like everyone keeping a personal copy in today’s automated office environment. Plato further asserted that Socrates believed written words were antisocial because they segregated themselves from living discourse, suggesting that, much like painting, “writing maintains a solemn silence”; written words stare at readers, telling them “just the same thing forever” (Plato 370 BC, 274c-276e, Translation by Fowler 1925, and Kelber 1995, 414).

In the early Renaissance, the rise of Gutenberg’s printing press made the written record more accessible and led to the broad dissemination of various religious and political ideologies and propaganda that threatened the power base of the aristocratic rulers of Europe. Printing was initially viewed with suspicion and contempt as a means of disseminating unapproved and non-authoritative information, and printing in the Renaissance period remained largely exclusive to the literate elite of society. The eventual democratization and broad distribution of written information over time has been largely beneficial to society, while at the same time diminishing the experiential aspect of dialogue and comprehension and contributing to the decline of oral memory records and oral history as true and credible accounts. The advent of writing may therefore be credited with the degradation of the value of the human memory in the management of authentic historical records.

So, just how credible and reliable an authenticator is the written record? The written record is a subjective snapshot caught in time; it is static and serves to externalise individual and collective memory. If only one person’s written account of an event survives, that person’s perception becomes the permanent record, complete with that individual’s subjective biases and interpretation.

It must also be noted that the written record is often ephemeral in nature, subject to all varieties of destruction and disaster (fire, flood, political ideology, and disintegration of the materials on which it is recorded). And once destroyed, it is lost forever. Survival of early written records is, therefore, inconsistent and often a matter of chance.

In contrast, the oral record is a living entity, and as such the collective oral memory is more reliable by right of common ownership within the entire community or culture. It is in essence collectively ‘authenticated’ and preserved. The oral tradition is derived from the community’s sense of what happened and what is important to preserve. “Memory, not textuality, was the centralizing authority” in cultures based on oral tradition (Kelber 1995, 417). The oral or memory record is passed on as a living entity that changes with new understanding and belief about the event, thereby reflecting the community’s, rather than the individual’s, belief about the truth of the record. Rather than being subjective or revisionist, oral records reflect a composite of understanding that is enriched with time and interpretation.

Updating the information contained in oral records is simply a matter of updating your memory or belief about a certain event or idea. According to Ginette Paris (1990),

“The memory at work in oral cultures allows for modification and adjustment, sometimes reversing the meaning of an event. It’s an active memory, which breaks into consciousness through archetypes, dreams and myths, fantasies, symbols and artistic work. It selects and organizes the past, putting into context what is recollected” (121).

Once something is committed to writing, it often becomes the ‘official’ version of the event and it becomes the permanent record. The problem with this process is that the written record is by nature static and inflexible. It may be superseded by another version, whether the original version remains intact or is physically destroyed. So, rather than seeing the rewriting of history in a contemporary context as dangerous revisionism, we can see it as an attempt to recapture the experience of living oral history. The written record can and has been used as propaganda that may seriously alter our perception of past events, especially if only one subjective version is maintained.

For example, many of us view the words attributed to Elizabeth I in the speech at Tilbury of 1588 as an accurate and contemporary record; however, the only surviving written account exists in a letter of Leonel Sharp in 1624. The existence of such a record, at 36 years removed, is concrete evidence of oral tradition at work and accepted into the corpus of so-called authentic written records. The irony of this example is that a culture that colonized others and dismissed their oral traditions has itself relied on orality to lay claim to instances of its own history and culture. Additional cross-cultural examples are found in complex religious belief systems, where the devout accept, as a matter of faith, that accounts recorded from oral tradition long after the fact are true renderings of the events. In certain contexts, the disdain for oral traditions can be equated with cultural racism and the desire to dominate ‘other’ cultures, providing justification for egregious exercises of power.

In contrast to the ancient world of Homer and other oral-tradition records managers, such as the Greek aoidos, the Anglo-Saxon scop, the Irish poet-ollam, the Italian cantastorie, the French jongleur and the English “singer of tales,” the modern world faced by information and records managers:

“is complicated and we are inundated with information as never before. So instead of straining our own frail memories, we arm ourselves with an array of elaborate aides-memoire; rolodexes, filofaxes, palm pilots and of course computers. Such aids have existed in one form or another throughout written culture, . . . their growing use is evidence that the locus of memory itself has left our individual, biological memories and is now part of an “accumulating external memory network” (Cambor 1999, 2).

Over the centuries, writing has alienated a large segment of the world’s population. Indeed, until the 20th century, the majority of the world’s population was illiterate: their learning and knowledge were based on oral traditions. It is a significant cultural loss that, cross-culturally and with notable exceptions, we have lost the capacity to exercise our brains in more than the basic autonomic processes necessary to encode memory. Today, our capacity to remember and retrieve volumes of information is so diminished that we now seek artificial memory enhancement – evidenced in the growing use of ‘natural’ supplements such as Ginkgo biloba and of brain training (Lumosity, for example).

Because they operated in an oral tradition, ancient oral rhetoricians (poets, bards, statesmen etc.) had to train their memories by using devices such as alliteration, rhythm, rhyme, stock epithets and synonyms. Irish poet-ollams (several levels of expertise above a bard) spent at least 12 years of their lives in a poet’s apprenticeship, training and memorising the volumes of tales necessary to their trade (MacManus 1967, 179-180). Genealogical inventories and epic histories were two of the methods employed by oral record-keepers and rhetoricians to classify, store and retrieve information vital to their culture and to their professions, and practitioners were highly regarded and respected within their cultural base.

Elaborate genealogical charts, such as those invented by the poet-ollams of pre-alphabetic-literate Ireland, conveyed the familial and cultural history of a people through lineage, as described in the epic Táin Bó Cúailnge (The Cattle Raid of Cooley). Futuristic societies such as the Klingon Empire, invented by science fiction writer Gene Roddenberry, are based on Anglo-Saxon culture, which values lineage, heritage and honour above all else. And the Judeo-Christian Bible is replete with genealogical inventories covering hundreds of years of corporate memory. All of these inventories were oral in origin, lasting for hundreds if not thousands of years in that form before being written down.

Cultural history is therefore corporate memory and it is in danger of being lost to our over-reliance on media outside ourselves. Just as oral tradition was largely replaced with written records that may or may not be credible, external technology designed to remember tasks on our to-do lists, or to record long-term memory, is replacing even the simplest brain based records management tasks.

Children raised in cultures built on oral tradition access their cultural heritage through oral records stored in the memory of every member of that culture. Children raised in alphabetic cultures learn that the written word, or contemporaneously the televised image, is the path to self-knowledge and cultural comprehension – televised entertainment acting as the official record-keeper of our culture. And if you don’t have time, you can simply record the information for later, contributing to a possible erosion of our innate ability to use our brains to record information and to access it when needed at a later date.

Oral tradition and history, activated and recorded by the human brain, preserve the corporate memory of families, societies and entire civilisations. Individuals brought up in an oral tradition are trained to process, store and retrieve large volumes of information by activating the enormous capacity of the human brain to organise, store and retrieve information in the form of memories. In oral traditional cultures, the brain’s capacity for memory enhancement, storage and retrieval has been used as the means of training each successive generation to remember its collective history and to preserve the culture’s vital records. Hence, archaeologists were able to find the city of Troy from the account in Homer’s Iliad, a collection of oral traditional stories spanning 8-10 centuries of Aegean cultural history, attesting to the veracity of oral record keeping in the absence of written records.

 

The Canadian Oral Tradition Context

The past disdain for oral veracity in record-keeping is notable in the history of the Canadian Federal and Provincial Justice systems with regard to Aboriginal oral history and oral traditions. However, there has been a subtle shift in the perception of collective memory and oral traditions in recent Canadian jurisprudence. This change reflects a reversion to previous generations’ respect for the orality of information and intellectual discourse. The emphasis on the written record as the primary authoritative record for corroboration of events and transactions is being challenged, with a shift away from the written authority on which western culture has based its educational, judicial and cultural institutions and practices.

The landscape is definitely changing, with such landmark decisions as Sparrow, Guerin, Delgamuukw and Tsilhqot’in, in which recognition of and respect for the collective memory has prevailed over the judicial requirement for documentary evidence referencing a pre-literate era in Aboriginal societies. Justice Vickers stated in his decision that:

Courts that have favoured written modes of transmission over oral accounts have been criticized for taking an ethnocentric view of the evidence. Certainly the early decisions in this area did little to foster Aboriginal litigants’ trust in the court’s ability to view the evidence from an Aboriginal perspective (Tsilhqot’in Nation v. British Columbia, 2007 BCSC 1700, para. 132).

And later,

Rejecting oral tradition evidence because of an absence of corroboration from outside sources would offend the directions of the Supreme Court of Canada. Trial Judges are not to impose impossible burdens on Aboriginal claimants. The goal of reconciliation can only be achieved if oral tradition evidence is placed on an equal footing with historical documents (ibid, para. 152).

This recent development in Canadian jurisprudence also supports the respect for Aboriginal oral history and oral traditions, which arose in a context where the majority of early post-contact western society was itself largely illiterate in the alphabetic sense. John Ralston Saul (2008) suggests that: “We all understand that in the eighteenth and nineteenth centuries most Aboriginals were illiterate; they could not read and write in European languages. But then neither could most francophone and anglophone Canadians. … our voting citizens were largely illiterate. Our democratic culture was therefore oral” (126).

In the case of Aboriginal record keeping, “ground-truthing became difficult if not impossible to accomplish because there may not have been Aboriginal individuals able to communicate with the authors of the contemporary record in either English or French” (Interview with Howard E. Grant, Executive Director, First Nations Summit Society, September 2016). According to Mr. Grant, context has also been a major impediment to understanding. For instance, the questions ‘do you live here?’ and ‘are you from here?’ often went unclarified in the sense of the local (house) or broad (region/neighbourhood) context as understood in Aboriginal culture. Mr. Grant referenced as an example the case of BCCA 487, Docket CA0727336, regarding the Kitsilano reserve lands appropriated by Canada for the development of CP Rail services in Vancouver. There are countless known cases of land appropriation that may be linked to cultural differences in defining what constitutes occupying and/or owning specific parcels of land. Added to that were cultural perceptions about ownership: an insistence that only historically recent paper records could denote ownership, whereas cultural memory, however lengthy, was not legitimate proof of either occupancy or ownership.

Another major impediment to understanding is the obvious loss of meaning in translation from western languages to Aboriginal ones and vice versa, and the subsequent misinterpretation of context within the translation between different cultural norms. The modern practice of anthropology, used in many legal proceedings through expert testimony, is in many cases flawed due to its inherently western bias, usually requiring some form of independent corroboration of the oral tradition evidence. Essentially, the corroboration of generations of oral record-keepers acting collectively and collaboratively was not perceived as equal to the written record.

Transfer of knowledge in Pacific Coast Aboriginal cultures was once achieved through observation and social interaction within the home and the community, wherein the next generations would ‘absorb’ and learn the cultural heritage and complex government systems of their individual tribe (Interview with Howard E. Grant, October 2016). Further, the Aboriginal oral traditions do not allow for ‘shortcuts’ when their content is shared with subsequent generations: as a result, when Potlatches are held, oral traditions are strictly maintained without deviation (ibid.).

With the advent of western cultures in North America, the ‘Si-yém’ (a Coast Salish term, meaning “respected one(s)”, denoting wisdom, knowledge and experience of the individual(s) so named), understood that change was inevitable, and they recognised that the younger population must become educated and acquire the tools and training necessary to replace oral record keeping. This was the beginning of a transition away from dependence on oral records in favour of written ones; however, it is still understood that careful recording is essential to protect their culture and history from misinterpretation by non-Aboriginal observers (ibid.).

At present, most Aboriginal communities are not necessarily seeking to re-establish an oral culture; rather, they are striving to maintain their collective memory as it may be required or useful in litigation, as well as maintaining their unique cultural and historical context. They are also seeking validation of their oral traditions insofar as they constitute the pre-contact/pre-literate record of their culture. There really isn’t a past or present divide: the oral record exists in a continuum.

Reliance on the written record in legal matters is also relatively recent in the long history of civilization, being enshrined in evidence legislation as recently as the late 19th century (Canada Evidence Act 1893). In judicial systems, the slavish reliance on written records as corroboration often neglects the fact that written records can misrepresent or lie on matters of historical fact to benefit one party over the other. Certainly in Canada the insistence on written documentation is at odds with historical practice, given the multicultural nature of the emerging pre-twentieth century Canadian populace and the need to “relate to power mainly through the oral” (Saul 2008, 127).

 

Memory Systems: Storage

If the “basic characteristic of the human brain is information processing” (Wang 2003), then those ancient [e.g. Celtic and Homeric Greek] societies and modern oral-traditional-based cultures such as Canadian First Nations developed sophisticated oral recordkeeping systems by building on the human brain’s natural capacity. They were not anomalies inventing for the sake of necessity, and they were certainly not primitive precursors to the written and electronic record-keeping norms of modern culture. Cultural practices that enhance memory storage, such as storytelling, singing and mnemonics, encouraged the development of the human brain to act as the receptacle of both individual and corporate cultural memory.

Current brain research informs our understanding of just how successful the human brain is as a storage and retrieval tool. Some studies in cognitive informatics suggest that the human brain is the most viable model for future generation computer and information retrieval systems that do not employ the brain-as-container metaphor: positing that memory is stored and retrieved in a relational manner (Carr 2010; Wang 2003). It seems that there is a “tremendous quantitative gap between . . . natural and machine intelligence”, with the gap favouring human brain capacities (Wang 2003). As a system, the human brain already discards, stores and retrieves information in a manner that is “more powerful, flexible and efficient than any computer system” (Wang 2003). This is consistent with oral record keeping that allowed for the revision of the story based on new information concerning the corporate cultural memory. This was considered best practice in oral record-keeping.

 

 

You have to begin to lose your memory, if only in bits and pieces, to realize that memory is what makes our lives. Life without a memory is no life at all, just as an intelligence without the possibility of expression is not really an intelligence. Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing.
~Luis Buñuel

 

 

Simply put, the human brain’s neural pathways encode memories. Every time the brain gets new information it compares it to old information and forms new connections (Arenofsky 2001; Wang 2003). If our brains were really just containers, all the information we start acquiring as babies would overflow a rapidly depleting capacity and we would have no more room. The human brain would be a storage facility with no structure or capacity for retrieval. We would all be serious hoarders with limited ability to make sense or use of the memories we had acquired and stored, and our users would be tripping over the boxes in the hallway, filled with content that had no discernible relationship. Instead, we are able to use our brains to organize in a relational fashion and thereby make accessible to us the content we need to make sense of our world and our lives. Just like any good data management system should.
But how does the human brain do what it does so well? How do our brains function and allow the development of such sophisticated oral recordkeeping? What part of the brain is responsible for memory formation and why are memories so important to human creative capacity and technological endeavours? What are the implications for future systems development?

Memory formation and storage is a multi-step and layered activity that takes place in the deeper structures and functions of the brain, most importantly the hippocampus, which saves short-term episodic memories and prepares them for long-term storage, and the neo-cortex, the long-term storage facility (Hsieh 2012). While there are several memory systems, each serving a different purpose (Bendall 2003), there are two structures coordinating multiple activities between the long-term and short-term memory systems. Within the frontal cortex is the short-term or working memory system, storing new information while keeping us actively conscious of what is being learned and saved. Within this short-term system is the phonological loop (e.g. silent talking to oneself or learning new words), and the visuospatial sketchpad that makes it possible to manipulate images in our minds. And finally, the central executive system keeps us aware of those short-term memories and coordinates both the sketchpad and the loop. The neo-cortex is, in evolutionary terms, a relatively recent addition to our genetics (Rakic 2009). “If any organ of our [human] body should be [seen as] substantially different from any other species, it is the cerebral neocortex, the center of extraordinary human cognitive abilities” (Rakic 2009, 724). It is the crux of our creativity and the uniquely human biological innovation that allows us to develop long-term memory storage systems that rival modern electronic records management systems.

We know from studying the development of human infants and children that experience is crucial to the formation of memory. If you hear something just once, the neurons often do not release enough chemicals to make a lasting impression on the formation of neural pathways or on memory retention. But if you associate, for example, a telephone number with a visual image such as a person or a place, or the warm chest of your parent with feeling safe and content, then there is more neuronal activity. The more neuronal activity there is, and the more experiences you have to stimulate that activity, the more likely it is that deep memories will be stored in the long-term memory systems of the neo-cortex (Hermann 1993; Bendall 2003). Episodic memory (experiences) that may be recalled and played over in the mind is a crucial step in the transfer of that recent memory into long-term memory storage.

The hippocampus has been heavily studied concerning its importance to the formation of long-term memory. “A sea-horse-shaped region tucked deep in the folds of the temporal lobe above the ear” (Carmichael 2004, 50), the hippocampus stores recent episodic memories. Those memories are then rehearsed in our minds and eventually stored as long-term memory (probably during sleep) within the neo-cortex (the outer layer of the brain). It is the repetition and recollection that stimulate the hippocampus to start the process of creating long-term memories that can be later accessed. Children who request the same bedtime stories about the events of their day, over and over, are in effect committing to memory a synopsis of the day’s activities in a process of encoding their family history and committing to memory the deeds of the collective. Watching television or videos is generally a non-experiential activity, therefore the accompanying neuronal activity is much less than reading a book, playing in the sandbox, doing a puzzle or telling stories out loud. The type of activity will determine the degree of information retention in the hippocampus, further initiating the transfer of that experience into memory and determining what is eventually available to the person as accessible knowledge in the form of memories.

 

Your memory is a monster; you forget—it doesn’t. It simply files things away. It keeps things for you, or hides things from you—and summons them to your recall with a will of its own. You think you have a memory; but it has you!
~John Irving, A Prayer for Owen Meany, 1989

 

It is now understood that babies are born without the significant neural pathway development associated with the adult brain. The information and experience children are exposed to will create the neural pathways or synaptic connections in their brains, or, not create them in the case of those who are neglected or abused as youngsters. Since children are not born literate, in the alphabetic sense, they spend these crucial few years as learners in a primarily aurally receptive environment, completely dependent on other people to define and determine the nature of their experiences and therefore the type of memory that will be stored in the neo-cortex. Children’s preferred method of exploring the world and learning is through play, not through rote learning or blanket memorization. Neural pathway development and later ability to store and retrieve information in the adult brain is dependent on children being exposed to a wide variety of repetitive and fun activities that stimulate neuronal activity in the brain and cause the creation of neural pathways.

This is not to say that only children have the capacity to form significant new pathways in their neural networks, or that later in life memory formation and big learning are outside the capacity of the adult human brain. In the past it was theorized and held as true that once you were past the baby stage, the brain became fixed, with little capacity to learn new things or heal from trauma. Scientists now know that the brain is a “work-in-progress” (Arenofsky 2001), with a healthy brain exhibiting the capacity to learn, change, grow and create new pathways until our end date. The human brain is essentially scalable. Like a well-built records management system, the brain can accept new information and accommodate new systems, incorporating and working with them in ways that were originally not anticipated.

 

Memory Systems: Retrieval

The human brain makes connections to what it already knows and decides what to keep. No multiple versions of the same memory file are cluttering up our hard drives and getting us into trouble when we send our boss the wrong version. But we know that it’s easier to forget information that is new, different or that we don’t care about. We rank memories according to their relevance to us, and that relevance is determined by how much we care. People can commit large amounts of information to memory to do well on an exam and then forget it all in a short period of time or over the summer break. Even those with so-called eidetic or photographic memory must care about or find relevant the information put to memory, or they will not be able to retain it. If one had true eidetic memory, they would not be able to prioritize or find meaning in the memories acquired, and it would be a huge data dump with no retrieval system to help mediate between the brain and the information (Schmickle 2010). We are instead the inheritors of a very sophisticated “command center” the size of a grapefruit, exhibiting extraordinary powers for storage, retrieval, relevance ranking and forgetting (aka culling, pruning, throwing away) (Arenofsky 2001).

How different is the human brain’s memory retrieval system from a computer’s? Is human memory similar to the RAM in a personal computer? (Freed 1997, 1). According to Kate Cambor (1999), “Something memorized by a computer is not the same as something memorized by a human.” It is not known how much maintenance the systems that store the most stable long-term memories require, but it is unlikely that human memory systems need maintenance in the same way as computer RAM, which basically involves the maintenance of current through a circuit. “There is no bio-electric [sic] system running current through our brains,” rather the “electric nature of the neurosystem comes from interactions between individual neurons and the task of maintaining this electric system requires only keeping the cells alive and keeping them connected to their neighbouring cells” (Freed 1997). In effect, we maintain our brain’s records management system by exercising our capacity for storing and retrieving memories: by experiencing life through all of our senses and exercising our natural capacity to manage information. Of course human memory can be disrupted by accident or disease, but the innate capacity of the human brain to store memory is not in question here. Since the hippocampus is critical for normal memory function (Hsieh 2012), we now know that if we damage or develop disease in that deep brain memory making place, then we have compromised our capacity to create retrievable memories in most systems of the memory making areas of the brain (Bendall 2003).

Human memory creation and long-term storage is admittedly a sophisticated and complicated process: but how is it that we retrieve those memories once they are stored? Human memory incorporates a variety of ‘search fields’ in the form of associational cues in order to retrieve that information. The brain cross-references those memories through a series of sensory triggers that index the relationships between our experience and the images, sounds, smells, emotions, colours, sensations and intonations that we associate in our memories with that experience. In order to function as an information management system, the brain has developed into a sophisticated structure that relies on retrieval schematics or cues such as mnemonic formulas (memory keys) designed to file and retrieve information/data in the form of memory. The neo-cortex, the records management system that houses those long-term memories, functions as a “context-dependent rather than location-addressable memory system” (Marcus 2009).

The associational retrieval cues operating in the human brain are the ‘colour-coded labels’ that trigger retrieval of information from long-term memory storage. For instance, your grandmother’s stories from childhood might be inextricably linked to the cue of the scent of baking scones. The difference between human memory retrieval and computerised retrieval is that the brain always maintains the context of the record wherever it files it in our memory, regardless of how difficult it sometimes is to initiate recall. Hence, the taste of an exceptional wine enjoyed in the company of good friends on a beautiful day can never be repeated in exactly the same way and the smell of baking scones is forever linked to your grandmother.
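The context-preserving, associational retrieval described above can be sketched in code. The following is an illustrative model only (the class, cue strings and record layout are invented for this example, not drawn from any cited system): each memory record is stored once with its full context, and is indexed under every sensory cue associated with it, so any single cue can trigger recall of the whole record.

```python
from collections import defaultdict

class AssociativeMemory:
    """Toy model of context-dependent, cue-triggered retrieval."""

    def __init__(self):
        self.records = []              # long-term store: each record kept whole
        self.index = defaultdict(set)  # associational cue -> record ids

    def store(self, content, cues):
        rid = len(self.records)
        self.records.append({"content": content, "cues": set(cues)})
        for cue in cues:               # every sensory cue becomes an access point
            self.index[cue].add(rid)
        return rid

    def recall(self, *cues):
        # Any single cue can trigger recall; additional cues narrow the result.
        ids = set.intersection(*(self.index[c] for c in cues))
        return [self.records[i] for i in ids]

memory = AssociativeMemory()
memory.store("grandmother's childhood stories",
             cues=["scent of baking scones", "grandmother", "kitchen"])
memory.store("a debate on the divine nature of Christ",
             cues=["wine", "friends", "wood-fire", "kitchen"])

# One sensory trigger retrieves the whole record, context intact.
hit = memory.recall("scent of baking scones")
```

Unlike a location-addressable lookup, the record here is never retrieved piecemeal: whichever cue fires, the full context comes back with it.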

 

 

Nothing is more memorable than a smell. One scent can be unexpected, momentary and fleeting, yet conjure up a childhood summer beside a lake in the mountains; another, a moonlit beach; a third, a family dinner of pot roast and sweet potatoes during a myrtle-mad August in a Midwestern town. Smells detonate softly in our memory like poignant land mines hidden under the weedy mass of years. Hit a tripwire of smell and memories explode all at once. A complex vision leaps out of the undergrowth.
~Diane Ackerman, A Natural History of the Senses

 

 

Memory records also differ from paper or electronic records in that the majority of memory records are not maintained or even delineated in a verbal format. All of our sensory organs are utilised in the creation of a memory record, which is rarely translated into a verbal or written record: human emotion is the prime example, most notably the sensation of fear. Language is an inadequate form in which to relay the complexity of most human perceptions, therefore the ‘hard copy’ or written form of human memory falls far short of the actual experience.

The human brain is not homogenous as a records management system, but displays the attributes of flexibility and scalability. While the basic structures are the same for all of us there are differences in how memory retrieval systems work in the individual, either due to neurological differences or sometimes from trauma. Animal scientist Dr. Temple Grandin (2016) suggests that her autistic brain functions much like a search engine.

My brain is visually indexed. I’m basically totally visual. Everything in my mind works like a search engine set for the image function. And you type in the keyword and I get the pictures, and it comes up in an associational sort of way (video).

Grandin is not unique in being a visual thinker: as an autistic person she is an extreme example of that mode of memory retrieval. Like all humans, and despite being completely visual in her thinking, Grandin retrieves her memories in an associational fashion, not as items out of a storage box, but as contextual memory, in her case in image form. This is what all humans do, whether visual, textual or auditory thinkers. Human memories are stored through the process of experience and association. We experience and then we associate other factors to that experience, further encoding it into our long-term memory system. Associations then become the cues that allow us to retrieve the memory after we have created it. How we retrieve it, and what our preferred or default method is (visual, auditory or another sense), is up to the unique wiring of our brains.

Oral rhetoricians have traditionally employed systems of ‘aides-mémoire’ or ‘memoria technica’ as generalized codes to improve their all-round capacity to remember and as an aid in oral and later in written composition. Simple rhymes are commonplace in many cultures, especially ones that are meant to help children remember basic concepts: “I before E except after C”. Acronyms and acrostics tend to be confused into one in our contemporary usage, with acronyms especially becoming the language of corporations and government throughout the world: IBM, AWOL. Acrostics for remembering the musical scale and how to spell arithmetic: “Every good boy deserves fudge” and “a rat in the house might eat the ice cream” help us to remember not only the deliberately filed information, but also trigger the retrieval of childhood memories of piano lessons and math tutors. The classic block numeric classification scheme that is employed by librarians and records managers world-wide is actually a grouping mnemonic used to classify lists on the basis of some common characteristic(s). Mental imaging, or the peg method, links words by creating a mental image of them, such as remembering a grocery list by having the items interact with one another in a bizarre fashion that stimulates short-term memory creation. ‘Loci et res’ as a mnemonic system involves assigning things a place in a space and can be explored through the science of architectonics (Parker 1977). What they all have in common is how they stimulate the memory systems of the brain to store and allow later retrieval of the information in the form of memory.
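The block numeric classification scheme mentioned above, as a grouping mnemonic, can be sketched very simply. The block ranges and category names below are hypothetical examples invented for illustration, not a standard scheme:

```python
# A minimal sketch of a block numeric classification scheme: each
# numeric block groups files sharing a common characteristic, so the
# number itself acts as the mnemonic. Ranges and names are hypothetical.
BLOCKS = {
    range(1000, 2000): "Administration",
    range(2000, 3000): "Finance",
    range(3000, 4000): "Human Resources",
    range(4000, 5000): "Operations",
}

def classify(file_number):
    """Return the category whose numeric block contains the file number."""
    for block, category in BLOCKS.items():
        if file_number in block:
            return category
    return "Unclassified"

# Any file number within a block is collocated with its category.
assert classify(2450) == "Finance"
```

Because the first digit alone signals the category, a records manager (or a filer's memory) can place and retrieve a file without consulting the full scheme.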

While the human brain is capable of making sophisticated associations and computations of diverse philosophical, mathematical and logical construction, computers by way of comparison, can make only limited associations based on their programming. Standard records management retrieval systems are therefore limited in their points of access.

Paper files are usually linked to alpha-numeric storage and retrieval patterns, and computer databases have a limited number of search fields, which require long-term planning and programming to prevent redundancy before the stored information has completed its active life-cycle. For example, you may design a personnel database that includes a Social Insurance Number search field, which may become problematic when privacy issues come to the fore in the field of records management. Human memory, on the other hand, is unlimited in its storage and retrieval patterns. It is elastic: a stored record can be retrieved by a wide variety of methods, some seemingly unrelated to the actual information, for example ‘déjà vu’.

 

 

Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.
~Steven Wright

 

 

The human brain has built-in collocating functions, syndetic structure and some pretty awesome authority control. One might think that relational databases used to store information in a records management system, such as a library database or on the internet would handle information in much the same way as the human brain. However, unless the database is programmed to collocate the records upon retrieval and thereby avoid duplication, then the resulting list will resemble those from a search on the internet. Someone has to tell those little bots to collocate, before they collect.
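The explicit collocation step described above (the one "someone has to tell those little bots" to perform) can be sketched as a small post-retrieval grouping pass. This is an illustrative sketch only; the record fields and normalisation rule are hypothetical:

```python
from collections import OrderedDict

def collocate(results, key=lambda r: r["title"].strip().lower()):
    """Group raw retrieval results under a normalised key, so the user
    sees one collocated heading per work rather than a flat list of
    near-duplicates (the way a catalogue collocates editions of a work)."""
    grouped = OrderedDict()
    for record in results:
        grouped.setdefault(key(record), []).append(record)
    return grouped

# Raw results from two sources, with a duplicate differing only in form.
raw = [
    {"title": "The Odyssey", "source": "catalogue"},
    {"title": "the odyssey ", "source": "web index"},
    {"title": "The Histories", "source": "catalogue"},
]

hits = collocate(raw)
# Two collocated headings instead of three flat results.
```

Without the grouping pass, the three raw records would come back as an internet-style flat list; the collocation has to be programmed in, exactly as the paragraph above observes.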

What the classical mnemonists and modern practitioners are doing when they deliberately design systems for memory aid is to take the process that the human brain naturally goes through to create memory records or metadata and enhance it to an extremely sophisticated level of data storage and retrieval. Remember that hippocampus, dependent on repetition and association to get experience/information into long-term memory. The sensory perceptions (images, sounds, smells, emotions, colours and sensations) that are associated with the creation of memory records in the human brain, become access points for retrieval. The context (who, what, where, when and why) plus the content (data, information, experience) form the metadata of the records management system.

Context = friends, wine, conversation, food
Content = debate on the divine nature of Christ
Memory Record/Metadata = names, faces, clothing, the lighting in the room, the paintings on the walls, the smell of wood-fire, the taste of the food
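The worked example above can be rendered as a hypothetical metadata record, purely for illustration: context and content form the descriptive fields, while the sensory associations become retrieval access points, any one of which can serve as a search key.

```python
# Hypothetical metadata record modelling the example above: context,
# content, and the sensory access points created alongside the memory.
memory_record = {
    "context": ["friends", "wine", "conversation", "food"],
    "content": "debate on the divine nature of Christ",
    "access_points": ["names", "faces", "clothing", "lighting in the room",
                      "paintings on the walls", "smell of wood-fire",
                      "taste of the food"],
}

def matches(record, cue):
    """A memory record is retrievable through any context field or
    sensory access point, not just through a single primary key."""
    return cue in record["context"] or cue in record["access_points"]

assert matches(memory_record, "smell of wood-fire")
```

In records-management terms, every sensory association is an indexed field; there is no one "file number" through which the memory must be found.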

  

Human versus Computer

Despite its recent lack of ‘exercise’, the human memory facility is so well constructed and organised that not even Martha Stewart could improve it. While human memory is also susceptible to viruses such as Alzheimer’s and to total system failure such as amnesia, it does not require constant upgrades. Even if the human brain alters through evolution, there are not the same problems with transferring data between software programs or with hardware incompatibility; despite language, cultural differences and time, the essence of experience appears to be constant. Technology, however, is rapidly replaced, upgraded and elaborated upon, and quite often does not allow for retrospective software compatibility or hardware rewiring. The human memory facility remains constant; you do not have to plug in any new bits of organic matter to make it go faster, better or more colourful. It is inexpensive, portable, space saving, efficient, dust-free and does not require a battery of info-technicians to help you when you have a glitch. Access to electronically stored information, on the other hand, can disappear with the pop and fizz of a short circuit, a virus, or a malicious hack.

Human memory records are filed and retrieved through emotive sensations. They engage all of our senses, as we perceive the content of the message. Can you imagine a computer program giving you the feeling of immense contentment when you have retrieved that little bit of data? Because it is the experiential memory that supersedes the factual memory, when you are told a story, the environment has as much to do with the retention of the memory as the tale itself. Human emotive metadata, combined with modern records management storage and retrieval systems, could result in database retrieval systems that mimic the manner in which the human brain creates multi-layered retrieval cues, based on the senses and the emotions, later employed by the brain to retrieve the information.

It is evident that humans possess an incredibly sophisticated method of creating and storing memories that allows us to be creative thinkers and toolmakers. But, is it not a stretch to suggest that our brain’s capacity for such activities can seriously rival an electronic records management system or that it should be considered the template for future development activities in records and information management or in the field of computer science? Consider this:

In 1973, a Canadian psychologist called Lionel Standing showed volunteers a series of photographs of objects for about 5 seconds each. Three days later, the volunteers were shown paired photographs, one that they had seen before and the other new, and were asked to say which was familiar. Standing increased the number of photographs shown to each person to an astonishing 10,000 and still they managed to identify the ones they’d seen before, with very few mistakes. Although this experiment tested whether they recognised something put in front of them, which is much less challenging than recalling something without any external cue, the results suggest that some aspects of human memory are effectively limitless (Bendall 2003, 1).

Mechanistic storage and retrieval within a limited and compartmentalized human brain was the theoretical blueprint that underlay early computer science, and that definition was used to theorize and make assumptions about brain development and capacity (Wang 2003). The recent theoretical shift in cognitive informatics towards developing technology based on the human brain, with its infinite possibilities for storage and retrieval, has implications for our understanding of how humans organize information and retrieve it. Our current understanding of the human brain can and should influence how we design better systems that use our brain’s capacity for relational data management and relevance ranking. Essentially, the human brain could be the model for a scalable solution to information storage and management.

The authors believe that respect for human memory oral records management systems can be restored through a more thorough understanding of the capacity of all humans to store and manage information. Through scientific understanding we can harness those capacities to enhance and design modern records management systems. By one early estimate, computers could store at best a billion bits of information, while human memory is capable of storing one hundred trillion bits (Paris 1990, 121) and making up to 500 trillion possible connections among the neurons of the brain (Arenofsky 2001). As suggested by Cambor (1999), most of us are awed by stories of people like:

The Greek statesman Themistocles, who in the 5th Century BC, is said to have been able to call by name all 20,000 citizens of Athens… [Any computer] could accomplish such a task of ‘memorization’ without even trying, but who would be impressed? Yet, if a person today did anything analogous, who wouldn’t be? (5)

 

Conclusion

Themistocles needn’t be seen as an anomaly in human brain capacity. Armed with a new understanding of how the human brain has been, and can be, employed in oral records management systems, Themistocles can represent that which is largely forgotten or underutilized: the potential of the human brain trained for efficient and effective storage and retrieval.

A computerized database is impersonal in that you may not be the one to plug in the information, but you are the one to retrieve it. You are not necessarily a participant in the creation of the record that is to be retrieved. Databases of the future could allow the creator and the user to interact with one another in the creation of living records that are altered or enhanced with each retrieval or storage of new information. External storage and retrieval systems may benefit enormously from a more extensive examination of human memory, with a view to developing systems that are more intuitive and responsive to natural human processes. Human memory is akin to data in five dimensions, layered with perceptions from each of the senses, creating a comprehensive and experiential understanding of the information. It is likely, with the acceleration of technology, that innovations that incorporate some of these factors will emerge in the near future. We already have artificial intelligence and virtual reality enhancements available in the marketplace. It is not such a grand step forward to envision a more comprehensive and experiential human interaction with data through technology.

 

Works Cited

Canada Evidence Act. R.S.C., 1985, C. C-5.
“Interview with Mr. Howard E. Grant.” Interview by Sandra M. Dunkin. September and November 2016.
Tsilhqot’in Nation v. British Columbia, 90-0913 BCSC 1700, 36 (2007).
Arenofsky, Janice. 2001. “Understanding how the Brain Works.” Current Health 1 24 (5): 6-11.
Bendall, Kate. 2003. “This is Your Life…” New Scientist 178 (2395): S4.
Cambor, Kate. 1999. “Remember This.” The American Scholar (Autumn).
Carmichael, Mary. 2004. “Medicine’s Next Level.” Newsweek 144 (23): 50.
Carr, Nicholas G. 2010. The Shallows. New York: Norton.
Connelly, Jim. 1995. “Designing Records and Document Retrieval Systems.” Records Management Quarterly (April).
Connolly, John. 1996. “You must Remember This.” Sciences 36 (3): 2.
Freed, Michael. 1997. “Re: Is Human Memory Similar to the RAM in a PC?” Aerospace Human Factors, NASA Ames Research Center, last modified January 6, accessed September 6, 2016. http://www.madsci.org/posts/archives/1997-03/852177186.Ns.r.html.
Gaidos, Susan. 2008. “Thanks for the Future Memories.” Science News 173 (19): 26-29.
Temple Grandin on Her Search Engine. 2016. Animated Video. Directed by David Gerlach. PBS Digital Studios: Blank on Blank.
Hermann, D. 1993. Improving Student Memory. Toronto: Hogrefe & Huber.
Herodotus. 1976. The Histories. Edited by Aubrey de Selincourt. New York: Penguin Books.
Homer. 1963. The Odyssey. Translated by Robert Fitzgerald. New York: Anchor Books.
Hsieh, Sharpley. 2012. “The Language of Emotions in Music.” Australasian Science 33 (9): 18-20.
Kelber, Werner. 1995. “Language, Memory, and Sense Perception in the Religious and Technological Culture of Antiquity and the Middle Ages.” The Albert Lord and Milman Parry Lecture for 1993-1994. Oral Tradition 10 (2): 409-45.
MacManus, Seumas. 1967. The Story of the Irish Race. New York: The Devin Adair Company.
Marcus, Gary. 2009. “Total Recall: The Woman Who Can’t Forget.” Wired 17 (4). https://www.wired.com/2009/03/ff-perfectmemory/
Parker, Rodney. 1977. “The Architectonics of Memory: On Built Form and Built Thought.” Leonardo 30 (2): 147.
Paris, Ginette. 1990. Pagan Grace: Dionysos, Hermes, and Goddess Memory in Daily Life. Dallas: Spring Publications.
Plato. 1925. Phaedrus. In Plato in Twelve Volumes, Vol. 9. Translated by Harold N. Fowler. London: William Heinemann Ltd. http://www.english.illinois.edu/-people-/faculty/debaron/482/482readings/phaedrus.html
Rakic, P. 2009. Evolution of the neocortex: Perspective from developmental biology. Nature Reviews. Neuroscience, 10(10), 724–735. http://doi.org/10.1038/nrn2719
Saul, John Ralston. 2008. A Fair Country: Telling Truths about Canada. Toronto: Viking Canada.
Schmickle, Sharon. 2010. “Why You Don’t Want the Dragon-Tattooed Lady’s Photographic Memory.” MinnPost.Com, July 8.
Wang, Yingxu. 2003. “Cognitive Informatics: A New Transdisciplinary Research Field.” Brain and Mind 4 (2): 115-127. doi:1025419826662.

Records Management in Canadian Universities: The Present and the Future

By Shan Jin, MLIS, CRM, CIP

 

Introduction

This article presents findings from in-depth interviews with twenty-six records managers, archivists and privacy officers who work in twenty-one Canadian universities. It provides a comprehensive view on current records management practices in Canadian universities. The main topics include program staffing, program placement, records retention schedules and classification schemes, physical records storage and destruction, university records centre, Electronic Document and Records Management Systems (EDRMS), training, outreach and marketing. It also examines the relationships between the records management program and internal stakeholders and identifies the needs for knowledge sharing and collaboration in the academic records management community in Canada.

 

Literature Review

In both Canada and the United States, modern records management originated in the federal government. Records management, as a professional management discipline, has been established for more than sixty years (Langemo 2; Fox 1). However, only a small number of scholarly articles have been written on records management programs in the higher education environment in North America, and even fewer focus on Canadian universities.

From their early days, university archival programs often assumed responsibility for records management (Saffady 204). Until recently, many universities’ records management functions still largely resided with the archivist (Zach and Peri 106). From 1990 to 2010, several studies on academic records management programs were conducted by researchers using surveys and interviews. Some were large-scale studies, such as Skemer and Williams’s 1990 survey on the state of records management, whose findings were based on responses from 449 four-year colleges and universities in the United States. Twenty years later, Zach and Peri conducted updated research on college and university electronic records management programs in the United States. Their article presented findings from their 2005 online survey of 193 institutions and interviews in 2006 with 22 academic archivists, as well as their 2009 online survey of 126 institutions. Although the focus of these two studies was not on Canadian universities, they provided some comparable data that are referenced in this article.

There were some small-scale studies which complemented the Zach and Peri research. Schina and Wells’ 2002 survey of fifteen American institutions and fifteen Canadian institutions provided relevant information from more than a decade ago, which is cited in the findings section of the article. Furthermore, there were two comparative studies that presented historical information on the records management programs in the University of British Columbia and Simon Fraser University (Brown, et al. 1-20; Külcü 85-107).

Higher education institutions have unique organizational structures and institutional cultures and traditions, which affect how records management programs operate within a university. Since there is a lack of comprehensive studies on records management programs in Canadian higher education institutions, this study will help to fill a research gap.

 

Research Scope and Methodology

Universities Canada (formerly known as the Association of Universities and Colleges of Canada) has ninety-seven member colleges and universities. Since it would be difficult to collect information from all of these universities over a short period of time, the author used a sampling method to decide the criteria for selecting participating universities for the study.

A quick email survey was sent to the records managers, archivists, or privacy officers of twenty Council of Ontario Universities (COU) members. The author asked these universities if they had a formal records management program with at least one employee who worked on records management for a minimum of fifty percent of his or her time. As demonstrated in the survey responses, none of the small Ontario universities (those with fewer than 10,000 students) had such a records management program (see table 1). Based on this finding, the author decided that universities eligible for this study would be those with an enrolment of at least 10,000 students, because those are more likely to have a formal records management program.

 

Table 1

Due to limited resources for the study, the author chose to collect data using individual interviews instead of large-scale surveys. Between April 2015 and January 2016, thirty potential participants were contacted via email with a cover letter and a consent form and invited to participate in the study. Eventually, twenty-six records managers, archivists, and privacy officers from twenty-one publicly-assisted Canadian universities agreed to be interviewed. Table 2 lists the number of participating institutions by province.

 

Table 2

Upon receipt of the consent forms, an in-depth 90-120 minute interview was scheduled with each participant. A questionnaire was sent ahead of the scheduled interview so participants could prepare for it. Interviews were conducted in one of three ways: face to face, by telephone, or using video conferencing technology. An audio recording was made with the permission of each participant. Eight site visits were also made during the same ten-month period. Additional information was gathered from email follow-ups and from the websites of the participating institutions. To protect the anonymity of participants, findings of this study reflect group results rather than information about specific individuals or universities, with the exception of publicly available information.

 

Findings and Common Concerns

Program Staffing

The study looked at the educational level of the persons responsible for the records management programs. Eighty-eight percent of the twenty-six participants have one or two master’s degrees in library and information studies, archival studies, or history. Thirty-eight percent of the twenty-six participants were hired into or moved to their current records-management-related positions in the last three years. The data gathered from the interviews are listed in table 3; they show that the larger a university’s student enrolment, the higher the full-time equivalent (FTE) staffing of its records management unit.

 

Table 3


The author also asked participants what percentage of their time was devoted to records-management-related duties. The responses show that many participants in this study have responsibilities in areas other than records management; on average, they spend 67% of their time on records management.

 

Records Management Program Administrative Placement

Unlike government agencies and private companies, Canadian universities often have a shared governance system. The academic side directly supports teaching, learning and research functions, and the non-academic side supports administrative functions. Early university records management programs often reported to university archives, an academic unit that is usually part of the university libraries. Data collected from the interviews show that records management programs established in the last decade have moved away from university archives and libraries and instead report to a senior administrative department, such as the University Secretariat and General Counsel.

Eighteen of the twenty-one universities that participated in this study have a formal records management program. All five newer programs (less than ten years old) report to an administrative unit.

Older records management programs (ten years or more) are split, with six reporting to a senior administrative department and seven reporting to an academic department. In total, 61% of the eighteen records management programs report to a senior administrative unit; the rest report to an academic unit (see table 4).

 

Table 4

 

All participants of the study shared their thoughts on the pros and cons of both placements. As summarized in table 5, both reporting structures have their strengths and weaknesses. Archivist William Maher provided some interesting insights in a discussion of academic archives’ administrative location in his book, The Management of College and University Archives. Maher pointed out there was “no single location that is best for all purposes” (23). He continued to say that too often “attention to the question of location is driven by dissatisfaction with limits imposed by the current parent department and the hope that some other parent would provide better support” (23). Although Maher was discussing archivists’ opinions on the administrative location of academic archives within the hierarchy of a college or university, participants of this study seem to have a similar mentality when it comes to the organizational placement of a records management program. Regardless of where the records management program is located, the author believes that records managers must capitalize on the advantages and overcome the disadvantages of its organizational structure in order to improve records management services. It is important to align efforts of the records management program with those of other strategic partners such as archives, Information Technology (IT) security, the legal department, and the privacy and compliance office.

 

Table 5

 


Records Retention Schedule and Classification Schemes

Records retention schedules and classification schemes are the basic components of a sound records management program (Kunde 190). All participating universities with a formal records management program have established a classification scheme. According to participants of this study, developing records schedules is an ongoing task. Common records schedules are a priority because they are used by all university departments.

The processes for drafting records retention schedules are very similar from one Canadian university to another, but final approval processes vary dramatically. Records schedules are:

  • Approved by a University Records Management Committee;
  • Signed off by a records director, or the president of the university, or a non-records management specific senior committee;
  • Not formally approved by any group in the university.

In Québec, the Archives Act requires that every public body shall establish and keep up to date a retention schedule determining the periods of use and medium of retention of its active and semi-active documents and indicating which inactive documents are to be preserved permanently, and which are to be disposed of (3).

Also, the Act requires every public body to, “in accordance with the regulations, submit its retention schedule and every modification of the schedule to Bibliothèque et Archives Nationales for approval” (3). Such a process takes a long time; however, its biggest advantage is that the schedules become law. Going through the provincial government approval process gives the records schedules more validation, and compliance with the schedules is mandatory in Québec.

In provinces outside Québec, compliance with records retention schedules is the responsibility of individual offices and is voluntary. Based on the study findings, university records managers often take on advisory or assistance roles. It is not their mandate to act as records management police by, for instance, enforcing compliance with retention schedules at a departmental level, but university records managers can encourage compliance by:

  • Defining roles for department/unit heads and staff in records management policy;
  • Providing training and creating tools to assist employees with records management tasks;
  • Using persuasion to encourage employees to use records retention schedules and classification schemes to manage records; and
  • Setting up a departmental records management coordinators network for better communication.

 

Physical Records Storage and Destruction Services

Canadian universities often have a decentralized budget model. When it comes to records storage and destruction, each department or unit is likely to adopt a self-managed solution, but some universities still make an effort to provide a central or a hybrid solution.

Data collected from the interviews indicate that:

  • Four universities have set up a University Records Centre or use a commercial facility for records storage. All activities are monitored by the records management program;
  • Six universities have a hybrid solution whereby departments and units can choose from using a centrally managed storage service or managing records on their own;
  • Many universities use policies and preferred vendors to regulate records storage and destruction activities on campus;
  • Fifteen out of the twenty-one universities (71%) have developed records destruction procedures to formalize records destruction activities;
  • Fourteen out of the twenty-one universities (67%) have a preferred shredding service provider; and
  • Only two universities have total control of records destruction on campus, with destruction activities carried out through their University Records Centres.

Most of the records destruction activities are self-managed by departments (see table 6).

 

Table 6

 

The responses from the participants of this study indicate that managing physical records is still a major responsibility for university records managers. Despite the decentralised nature of a university’s organizational structure and budget model, the author believes records managers should seek a degree of central control over physical records storage and destruction.

 

University Records Centre

Building or creating a university records centre is one way to gain central control over storage of semi-active records and destruction of inactive records. In Skemer and Williams’s 1990 study, 52% of American universities provided records centre storage (542). Data collected from the interviews show that eight of the twenty-one universities (38%) that participated in this study have their own university records centre or records storage facility.

Table 7 shows some of the services provided by these Canadian university records centres.

 

Table 7

Electronic Document and Records Management Systems (EDRMS)

Most of the participants of this study did not feel confident tackling the daunting task of managing electronic records without the right software tools. One of the best solutions for managing electronic records is an EDRMS, which is designed to facilitate the creation, management, protection, security, use, storage and disposal of a range of both physical and electronic documents.

According to responses from the interviews, six out of the twenty-one universities (29%) are providing some degree of central software solution to manage electronic records. However, there are many challenges, especially for a highly decentralised organization like a university. Here are some common ones:

  • Available solutions are too expensive;
  • Lack of a central approach;
  • Offices are more interested in business automation, so records retention and disposition are not their major concerns; and
  • Records management functionality is often overlooked when records management staff only play an advisory role in an EDRMS project.

One successful EDRMS project is Concordia University’s eDocs project, which has been ongoing since 2013. They were able to secure central funding through their Vice-President of Development and External Relations and Secretary-General (Peacock). The solution is free to all faculty and staff, but adoption is voluntary. The key to its success is that the EDRMS project is co-led by the records management department and the IT department. The project had five full-time employees according to the Records Management and Archives department’s organizational chart from 2015:

  • one Project Manager,
  • two Business Process Analysts,
  • one Change Management Lead
  • and one Archivist / Records Officer.

Phase I of the eDocs project was completed in 2015 with the application installed on 300+ users’ desktop computers. Phase II will expand to 500+ users. The EDRMS replaces shared drives and has a records classification plan embedded in the system. This success story reaffirms that a successful EDRMS project must have high-level support and cooperation from the office of IT or a unit of equivalent function (Zach and Peri 122).

As Kunde suggested, records managers need to engage in activities that position them to be more active partners in managing an institution’s information resources, particularly those that are in an electronic format (189). In order to gain a better control over electronic records, the author believes records managers, privacy officers, archivists and IT should work together to offer some central solutions. For example, this might involve:

  • Standardizing processes for any future EDRMS projects;
  • Making a cross-functional team approach to implementing an EDRMS mandatory; and
  • Providing joint training programs with IT on best practices for managing electronic documents and records.

 

Training

Participants in this study provide training using both traditional methods and new technologies. There are formal classroom training sessions and less formal information sessions, such as Lunch and Learn. University records managers usually have easy access to learning management systems and can therefore explore web-based training programs. Many learners prefer this format because they can go online and learn at their own pace. Some participants mentioned they have started to use multimedia technologies, such as YouTube videos, webcasts, and podcasts, for training.

The content of records management training varies. Records managers choose the content according to the needs of the audience. It can be an introduction to records management, or advanced courses on implementing a file plan, email management, and managing shared drives. Records managers also target different audiences, including senior management, office administrators, records management coordinators, and new employees. Many participants talked about making training a joint effort of the records management program, privacy office and information security office.

Schina and Wells pointed out that systematic training is the key to the success of a records management program (48). The author believes that university records managers should try their best to allocate staff and time to train employees with records management duties.

 

Outreach and Marketing

A records management program hidden from public view is often misunderstood and forgotten by the very people on campus who rely on its services (Purcell 134). The records management program needs to raise awareness among employees and improve its visibility. To achieve this goal, records management staff need to utilize all resources available. This might include:

  • Setting up a records management coordinators network;
  • Having regular meetings with senior management;
  • Generating good word-of-mouth through interactions with employees; and
  • Providing useful tools on the records management website.

Through this study, the author learned about some creative methods colleagues at other universities have adopted to market their records management programs, such as using records management-themed coasters, mouse pads, and fortune cookies with records management tips inside. A Records Management Day with free pizza turned out to be an effective way to boost employee morale and raise awareness of the records management program.

The author thinks records managers need to find ways to reach employees and promote the records management program, for example, establishing a records administrators group, attending campus events and putting articles in campus newspapers.

 

Records Management Program and Internal Stakeholders

RM and Privacy

The author learned from this study and work experience that Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) has had a great impact on records management programs in Ontario universities because the basis of the Act is the right to access information held by public bodies, and the right of access depends on the appropriate management and preservation of records (“FIPPA and MFIPPA” 1). In Ontario, four of the nine university records management programs were launched after Ontario universities were brought under the FIPPA legislation in June 2006. The newest records management program was the direct result of a change to FIPPA in December 2014, which requires every head of an institution to take reasonable measures to preserve records in the custody or under the control of the institution.

Participants of the study from outside Ontario also mentioned the great impact of their provincial access and privacy legislation on their own records management programs. Similarly, Skemer and Williams’s 1990 survey revealed that state laws and regulations were important reasons for the creation of records management programs at publicly supported institutions in the United States, because legal pressure was probably more persuasive in colleges and universities that rely principally on public financing (537; 545).

Records management programs and privacy offices often have a collaborative relationship. According to data gathered in this study, seven out of the eighteen (39%) records management programs report to the same senior administrative office as their university’s privacy office.

Twenty-seven percent of the participants have both records management and compliance support duties. Many participants of this study agreed that the FIPPA legislation was a strong driving force for records management programs. The records management program and the privacy office share the same goal of educating people on best records management practices, but with some different emphases.

 

RM and Archives

Many Canadian universities’ records management programs emerged from university archives. Data from this study show eleven out of the eighteen (61%) university records management programs are placed in the university archives or in the archives and records management joint department. Forty-two percent of the participants have both records management and archives duties.

University records management staff often seek input from archivists regarding records retention and disposition. Archivists understand the value of the records management program to their own program; records management staff act as advocates for the archives and make sure archival records are transferred to the archives. In the author’s view, the records management and archives programs are natural allies who share common interests. This is especially true when university archives have the mandate to collect institutional records. As Purcell pointed out, a strong relationship between the records manager and the academic archivist is a sign of a successful records management program (134).

 

RM and IT

In their 2002 article, Schina and Wells stated that both U.S. and Canadian university records managers wished to develop a closer relationship with their IT colleagues and to participate in electronic records management decisions (43). Fourteen years later, this remained a concern for the participants of this study. During the interviews, many mentioned that there was a disconnect, and sometimes miscommunication, between the two departments. For example, IT staff and records managers may understand the word “archiving” very differently. Many participants said that when IT was leading an EDRMS project, such disconnects and miscommunication often led to the exclusion of records management staff from EDRMS initiatives on campus. Sometimes, records management became an afterthought.

In order to improve this situation, the author thinks records managers should try to build a mutually beneficial relationship with IT. The two units do have some common interests, including EDRMS, cloud computing, and information security. Records managers need to show what the records management team can bring to the table. For example, records management professionals are experts on data retention and advocate reducing server storage space and eliminating duplication. When IT sees the potential benefits of collaborating with the records management team, it is more likely to work with records management staff on EDRMS-related projects. There is a need to align records management efforts with IT efforts.

In summary, experts from records management programs, archives, privacy offices and IT departments can bring their unique perspectives into conversation about managing a university’s information assets. It is important to create and maintain strong partnerships among all these internal stakeholders in order to achieve the university’s overall goals of information governance.

 

Communication and Collaboration (External)

Kunde pointed out that, with the reality of small records management programs and low levels of staffing, efforts to collaborate can benefit everyone (205). During the interviews, many participants expressed their interest in seeking collaborative opportunities with colleagues at other universities.

Universities in the same province are governed by the same laws and regulations. In the author’s opinion, there are opportunities for knowledge sharing and collaboration among university records managers. In British Columbia, four universities started holding monthly teleconference calls in 2015. In Ontario, the Council of Ontario Universities (COU) provides a communication platform based on the existing Records Managers Group web portal and mailing list. Records managers of COU members also started holding regular teleconferences in spring 2016. These activities indicate a good start.

Any collaboration initiative needs strong leadership, funding and human resources. There are good examples from the academic library community, such as the Ontario Council of University Libraries’ Collaborative Futures project, which maximizes the existing expertise and resources among Ontario academic libraries. One of its projects is to manage and preserve print resources: five Ontario universities (McMaster, Ottawa, Queen’s, Toronto and Western) undertook a project to consolidate physical library materials into one shared space at the University of Toronto Libraries’ Downsview property. The Downsview facility implemented a High-Density Racking System and Mechanized Retrieval to ensure orderly retrieval of low-use print material (“Improving” 8). Access to the entire collection is provided by an online request service, supported by a daily courier (“Welcome”). Such a model can be used for storing physical university records as well. For example, the University of British Columbia’s Library PARC is a similar high-density storage facility, but it is used for both low-circulation library collections and university records (“Library Parc”).

University records managers can also learn from the collaborative efforts of Ontario municipalities. In the 1990s, Ontario’s Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) came into effect. One company, the Information Professionals, worked in conjunction with The Association of Municipal Managers, Clerks & Treasurers of Ontario (AMCTO) and developed TOMRMS (The Ontario Municipal Records Management System).

TOMRMS is a model system for managing records, with a classification and indexing scheme, retention schedules, and citation tables that refer to applicable laws. As of 2016, TOMRMS had been used by about one hundred Ontario municipalities. Could smaller academic institutions with newly established records management programs collectively outsource the task of developing classification schemes and records retention schedules? Ultimately it is up to the university community to identify common interests and start the conversation.

 

Conclusion

Canadian universities have been practicing records management for more than half a century. The University of Toronto was among the earliest English-language Canadian universities to establish a records management program. In 1989, the University of Toronto’s president passed the Presidential Regulations for the Management of Archives and Records, which marked the launch of its records management program (“About the Records Management Program & Services”). Today, larger institutions like the University of Toronto generally have an established records management program, while smaller institutions only started developing their own programs in the last decade, driven by legal requirements. Provincial access and privacy legislation has often had a significant influence on academic records management programs because the right of access to information depends on the appropriate management and preservation of records (“FIPPA and MFIPPA” 1).

Although data were collected from only twenty-one academic institutions (21.6% of all Canadian colleges and universities), the results of this study still reveal some general patterns in current recordkeeping practices at public Canadian universities. There is a clear trend of relocating the records management program outside academic units, such as the university libraries, which mainly support teaching, learning and research. About 28% of the records management programs surveyed were established in the last decade, and all of these report to a senior administrative office. In total, 61% of the programs are located in a non-academic unit. Many believe that placing the records management program in a senior management unit raises its profile, as university-wide records management decisions made by senior administration are often perceived as carrying more authority.

Most Canadian universities have limited resources and can provide only basic records management services. Records management programs mainly play an advisory role in a highly decentralised university environment. None of the participating institutions’ records management programs have a mandate to carry out records management audits to ensure compliance with approved records retention schedules and other internal records management policies. Adoption of records retention schedules is often voluntary. However, the unique schedule approval process in Québec makes compliance with records retention schedules mandatory in that province.

Although universities often operate in a decentralized fashion, records managers are making an effort to provide a degree of central control by developing common records retention schedules and campus-wide records management policies and procedures, and by providing centralised or semi-centralised records management solutions for both physical and electronic records. However, implementation of EDRMS solutions is still in its infancy. Many universities simply do not have the human and financial resources to run an effective, university-wide electronic records management program.

The study also shows there is an increasing need to build and enhance connections among records professionals within the academic records management community. Today’s academic records managers are facing the same challenges and working with limited resources. Continued success in university records management requires records professionals to actively explore ways to bring local records management expertise together for knowledge sharing and collaboration.

 

Works Cited

“About the Records Management Program & Services.” University of Toronto Archives and Records Management Services, University of Toronto. utarms.library.utoronto.ca/about-utarms/rmps. Accessed 15 Oct. 2016.
Archives Act. c. A-21.1. Québec. Bibliothèque et Archives nationales, Government of Québec, 1 Nov. 2016, legisquebec.gouv.qc.ca/en/pdf/cs/A-21.1.pdf. Accessed 16 Aug. 2016.
Brown, Helen, et al. “Records Management in the University of British Columbia and the University of Pennsylvania.” UBC Social Ecological Economic Development Studies (SEEDS) Student Reports, 25 Nov. 2010, pp. 1-20. open.library.ubc.ca/cIRcle/collections/undergraduateresearch/18861/items/1.0108622. Accessed 15 Oct. 2016.
“FIPPA and MFIPPA: Bill 8 – The Recordkeeping Amendments.” Guidance Documents, Information and Privacy Commissioner of Ontario, 22 Dec. 2015, www.ipc.on.ca/wp-content/uploads/Resources/Bill8-New-Recordkeeping-Amendments.pdf. Accessed 15 Oct. 2016.
Fox, Uta. “The History of Records Management in Canada, 1867-1967.” Sagesse, Journal of Canadian Records and Information Management, Spring 2016, pp. 1-11. www.armacanada.org/index.php?option=com_docman&view=download&alias=43-the-history-of-records-management-in-canada&category_slug=canadian-rim&Itemid=458. Accessed 14 Aug. 2016.
Freedom of Information and Protection of Privacy Act. R.S.O. 1990, c. F.31. Ontario. Minister of Government and Consumer Services. Government of Ontario. 2016. www.ontario.ca/laws/statute/90f31. Accessed 8 Aug. 2016.
“Improving Efficiency at Ontario Universities: A Report by the Council of Ontario Universities.” Council of Ontario Universities, Dec. 2015, cou.on.ca/wp-content/uploads/2015/12/COU-Improving-Efficiency-at-Ontario-Universities-Dec2015.pdf. Accessed 15 Oct. 2016.
Külcü, Özgür. “Records Management Practices in Universities: A Comparative Study of Examples in Canada and Turkey.” The Canadian Journal of Information and Library Science, vol. 22, no. 1/2, 2009, pp. 85-107. Accessed 22 July 2016.
Kunde, Nancy M. “Reframing Records Management in Colleges and Universities.” College and University Archives: Readings in Theory and Practice, edited by Christopher J. Prom and Ellen D. Swain, Soc. of American Archivists, 2008, pp. 185-208.
Langemo, Mark. Winning Strategies for Successful Records Management Programs. Information Requirements Clearinghouse, 2002.
“Library Parc.” University of British Columbia Library, about.library.ubc.ca/changes/libraryparc/. Accessed 15 Oct. 2016.
Maher, William J. The Management of College and University Archives. Soc. of American Archivists, 1992.
“Organizational Chart: Records Management and Archives.” Concordia University, 5 May 2015, www.concordia.ca/content/dam/concordia/offices/vpdersg/docs/RMAD-org-chart.pdf. Accessed 15 Oct. 2016.
Peacock, Tom. “eDocs project promises improved electronic documents management.” News Stories, Concordia University, 19 June 2013, www.concordia.ca/cunews/main/stories/2013/06/19/edocs-project-promises-improved-electronic-documents-management.html. Accessed 15 Oct. 2016.
Purcell, Aaron D. Academic Archives: Managing the Next Generation of College and University Archives, Records, and Special Collections. Neal-Schuman, 2012.
Saffady, William. “A University Archives and Records Management Program: Some Operational Guides.” College and Research Libraries, vol. 35, no. 3, 1974, pp. 204-210.
Schina, Bessie and Garron Wells. “University Archives and Records Programs in the United States and Canada.” Archival Issues, vol. 27, no. 1, 2002, pp. 35-51.
Skemer, Don C. and Geoffrey P. Williams. “Managing the Records of Higher Education: The State of Records Management in American Colleges and Universities.” American Archivist, vol. 53, no. 4, 1990, pp. 532-547.
“Welcome to UTL at Downsview.” University of Toronto Libraries, onesearch.library.utoronto.ca/downsview. Accessed 15 Oct. 2016.
Zach, Lisl and Marcia Frank Peri. “Practices for College and University Electronic Records Management (ERM) Programs: Then and Now.” American Archivist, vol. 73, no. 1, 2010, pp. 105-128.

 

From Promise to Fulfillment: The 30-Year Journey of Electronic Recordkeeping Technology

By Bruce Miller

 


This is the inside story of the development of electronic recordkeeping software technology from its inception to its current capability and status today. I don’t want this to be just a history lesson, but an objective and factual story of what worked, what didn’t work, and where it has led us. I’ll also try my best to predict where we need to go in the future, based on where we are today. This story contains plenty of my opinion, and please remember, it’s just my opinion! I hope you enjoy the story….
– Bruce Miller

 

I see three distinctive phases, or stages, of electronic recordkeeping technology development to date. They are:

  • e-Records Software: 1983 – 2002. The birth of electronic recordkeeping software. The early, uncomfortable pioneering days, and the introduction of a US software standard that forever changed the marketplace.
  • Document Management: 2002 – 2012. This is the period when the large, leading document management software firms made their move to incorporate recordkeeping capabilities, and changed not just the technology, but how such solutions were delivered to the marketplace.
  • Microsoft: 2012-present. As a single company, Microsoft’s pervasive presence on our business systems across the globe means its influence is, to put it mildly, outsized. Microsoft’s SharePoint ECM platform began to gain significant ground on the traditional ECM providers. This stage marks the period when Microsoft finally realized that, like the established ECM players, they too needed to deliver recordkeeping capability. While their competitors already had well-established recordkeeping capabilities, Microsoft moved to make recordkeeping a fundamental cornerstone of SharePoint. For better or for worse, their move led to massive changes in recordkeeping software technology, and ultimately to what I believe has been a ground-breaking innovation that broke through a key barrier that held us back in the earlier stages.

 

The Three Stages of the Market

e-Records Software

This stage begins in 1983, when a few of Canada’s federal government agencies, led by a group known as the Canadian Workplace Automation Research Centre (CWARC), awarded a handful of fairly significant grants to up-and-coming technology firms to innovate new technology for “office automation”. Each grant was sponsored by a different federal department. The grants formed part of a contest of sorts – each grant would fund a proposal for a research and development project. Each resultant proposal would be judged and a winner selected. The winner would receive an additional grant from the government to hopefully spur the commercialization of the proposed technology. I was one year out of school, and working for a Montreal-based firm building the world’s first portable computer, Bytec’s Hyperion, in Ottawa.

This firm was a recipient of a grant sponsored by National Archives of Canada (NAC) and the Department of Communications (DOC). DOC wanted to stimulate technology industry growth. NAC wanted new technology capability to manage electronic records. It fell to me to put together a proposal for a new technology, so I formed a team of young software specialists from my company, and key stakeholders from NAC and DOC. NAC would take the lead on defining requirements. DOC would be an example of a federal government consumer of the new technology, and help with requirements definition. We called it Foremost (Formal Records Management for Office Systems Technology). Our proposal won the contest, but by the time any formal project could be formed, the company I was working for collapsed around us. The future was bleak, but we had this great idea for new software.

My team and I decided to form a new company of our own to build the technology we proposed. We each threw in what small amount of cash we could muster, and we had ourselves a company. We secured venture funding to sustain our operations while we set about building this new software. NAC and DOC held out hope we could somehow, someday, eventually deliver a commercial product they needed, so they continued giving us guidance and direction on features we needed to offer. The software gradually grew to the point where we could actually deliver an early version of it. Meanwhile, a local Ottawa-based software consulting firm joined the race by developing a competitive product. By now we were also hearing of a second competitor out of Australia.

Undaunted, we continued to grow our company, and NAC tested and validated the software. We were beginning to receive much attention throughout the USA, along with increasing interest around the world, mostly from government agencies who were realizing that more and more of their records were in electronic form, yet they had no way to apply their obligatory recordkeeping control to these records. NAC Director John Macdonald (now retired), was a key internal champion of the project, and he undoubtedly influenced archives departments of other countries with his efforts to push for this type of technology. It has been said that through Macdonald, NAC’s work became a key factor in the International Council of Archive’s (ICA) eventual standard for electronic recordkeeping.

We were a typical small software start-up in so many ways. Everything we did was “the first” – nothing like this had ever been built before. We had competitors; however we knew we were way out there on the innovation curve. We steadily developed the product with new releases every year. We were starting to get a few sales from early adopters, but no sales of sufficient size to make the company self-sustaining, let alone a financial success. We were getting a great deal of interest and promise, including from the US military and large corporations, but many potential buyers seemed reluctant to buy from a small Canadian software start-up.

Before I knew it, nearly a decade had passed. Our competitors were growing stronger and adding features, as were we. Our advantage however was venture funding – we had sufficient capital to grow a strong development team focused on building innovative new features, without having to wait for sales revenue.

Then the US Gulf War happened. Returning war vets complained of a never-before-seen illness that came to be known as “Gulf War syndrome”. What does Gulf War syndrome have to do with electronic recordkeeping? A lot, as it turned out. Gulf War vets launched a class action lawsuit against the US military for compensation. I don’t recall the precise outcome of the litigation; however, I know it cost the US military a great deal of taxpayers’ money. The Pentagon subsequently concluded that the failure to recognize the disease was caused in large part by the destruction of soldiers’ electronic medical records on battlefield computers. In an effort to prevent the loss of valuable electronic records in the future, they mandated that all US military agencies must now manage their electronic records properly. To hasten the transition to electronic recordkeeping, they established a mandated software standard, US DoD 5015.02-STD, which specified the requirements that all records management software systems used to store electronic records must meet.

The Pentagon went further. They boldly decided to form a testing agency to certify vendor software products, ensuring the software met the stringent new standard. This new testing body was to be housed at Fort Huachuca, AZ, under the auspices of the Joint Interoperability Test Command (JITC).

Over time, this standard turned out to be a complete market game changer. Now the US military, arguably the largest consumer of recordkeeping technology in the world, could only buy technology that met this new standard. The US National Archives and Records Administration (NARA) embraced the standard and recommended that all US government agencies comply with it. This set off a chain reaction of “me-too” behaviour. Before long, plenty of US state and local government agencies adopted the standard. Even many corporations jumped on the bandwagon.

Requests for Proposals (RFPs) to purchase recordkeeping software began listing 5015.02 compliance as mandatory, a practice that would eventually spread to Canada and Europe. I estimated at the time that nearly half of all RFPs for recordkeeping software made 5015.02 compliance mandatory.

There was just one problem.

There was no software to test. No software had ever been written to this new standard. My team and I provided substantial input to the team developing the standard; however, JITC was very careful to remain neutral toward any single vendor. I remember many long debates with the team about what was possible versus what was practical. The resultant standard was a very daunting list of capabilities never seen before. I felt our new software met the bulk of the core requirements, but a great deal of work would be needed to meet the full standard. The race was on to get certified against the new standard. And we were not the only company – everyone in the recordkeeping software business now had to get certified, or risk losing key market share.

I knew this was an all-or-nothing gamble for our company – we had to do it. So we doubled down on venture funding to build up our team with more skills and resources, and we scheduled the first certification test of our software in Fort Huachuca.

There was just one problem.

How was JITC to test software they had never seen? Test procedures had to be written. To write the test procedures, JITC had to know what they were testing. But they had never seen any software like this before. They were new to recordkeeping, let alone to this new breed of software. To compress a long story into a few short milestones, I found myself routinely visiting the Fort Huachuca labs, on the Mexican border, over a period of several months. I had become accustomed to the folks at the lab, and they were familiar with me. Being an active military base with a drone airport, the fort was a typical bustling US military base and I fit in with the many other contractors and civilians working on day passes.

Once the test procedures were finalized, the big day came for the test of our software – the first ever for the JITC team. I showed up and joined the team at the lab, as usual. Security was unusually rigorous however. The US had launched the Second Gulf War since I was last at the lab. Unbeknownst to me, each base raises its security level when the country goes to war. I was stopped in the hallway by military police and asked to show my ID. My Canadian passport was all I ever needed to show in the past, but not any longer. On day one of the test, I found myself literally out in the Arizona Desert with my Canadian passport in hand. My laptop, with all of its test data and test procedures, had been confiscated and was to be erased forever. I will never be able to tell the story of how we recovered from that, but suffice to say it involves really fine folks from the US military, hot desert, and a lot of Kentucky Fried Chicken and Coca-Cola over many days.

And so the world now had its first 5015.02-certified software. Other software firms were lining up to schedule their certification tests, and JITC was booked up with tests for the next 2 years. The world of electronic recordkeeping software would never be the same. The primary focus of the software became 5015.02 certification. This standard obligated all the vendors to build these features into their software, or risk losing market share. This standard ultimately shaped the overall architecture and feature set of all compliant software. Our local Ottawa competitor went on to get certified, as did our Australian competitor.

Once the US standard had been well established, Australia followed up with their own standard called the Victorian Electronic Records Strategy (VERS). It would be several years before the Australians developed a software certification testing regimen to accompany the standard. Other standards began popping up:

  • ICA, Module 2 Principles and Functional Requirements for Records in Electronic Environments. This eventually became ISO 16175-2:2011.
  • MoReq (MoReq1, MoReq2). Originally intended as a European standard.

It appeared to me that each of these standards was attempting to somehow outdo or at least add onto the US standard. At the time, I felt the US DoD 5015.02 standard to be absolutely essential to sell into the US market.

MoReq appeared to me to be asking too much of the software, such as the ability to “preserve the readability of the electronic document forever”. Good luck with that! I struggled to take it seriously, and I never encountered a single case of a buyer demanding compliance with it. I admired the VERS standard, as it indeed built on 5015.02, but once again it seemed to only matter to Australian buyers, whereas I, and I believe most of my competitors, were focused on the US market.

So what about Canada? Our federal government, via the Treasury Board of Canada, eventually selected ICA Module 2 as the standard for Government of Canada software. This standard is very different from 5015.02 in that it’s written from the perspective of an archivist, primarily concerned with capabilities that support the preservation of records. The ICA standard simply is not as focused on the active lifecycle of the records – much of what it calls for is downstream of the active lifecycle. Additionally, most of the standard’s input came from Commonwealth countries, and it really shows. The old-fashioned block-numeric classification system and the security levels (Confidential, Secret, Protected, Top Secret, etc.) are reflected in it. ICA Module 2 calls for a breathtaking 275 capabilities versus 168 in 5015.02, some of which, similar to MoReq1, were simply aspirational and impossible to deliver with today’s technologies.

I’m not advocating one standard above another – all standards serve a useful purpose. I worry however when I see the word “should” in a software standard. You cannot test a software feature against a “should” requirement. To me, a software standard is only a software standard when real-world technology can be tested against it. All other standards are aspirational, and no software vendor should ever claim compliance with an aspirational standard, but some do.

How were the e-records software firms doing? The Australian firm was doing the best of the three I’ve been mentioning, as they appeared to be selling more software (albeit mostly in the Pacific Rim region), and the company was growing larger and faster than the others. They were concentrating on the local Australian and European markets, and doing well. The company grew to around 35 people or so. They were somewhat late in obtaining US DoD 5015.02 certification, but eventually they set up a small beachhead in the USA by hiring a well-respected former US Air Force pilot, later turned military records program manager, to represent them there. They were primarily delivering solutions aimed at physical records (paper folders, boxes). The management of electronic records always appeared to me to be a secondary effort for them. Similarly, our local Ottawa competitor largely sold paper-based recordkeeping solutions, with electronic recordkeeping as a secondary effort. They soon obtained 5015.02 certification, and won a “lowest-bid-wins” competition with the Canadian government for what would eventually be known as RDIMS (Records and Document Information Management System)1. While we were disappointed in this, we were convinced the US market was where our future would be written, not in Canada. Unlike all of our competitors at the time, our solution was purely for electronic records, with physical records as a secondary feature set – the exact opposite of our competitors.

Why did we not focus on paper records? Because our mission was to deliver solutions for electronic records, not paper records. The world was awash in competent paper records handling solutions. We stayed laser-focused on that mission!

The first 5015.02 certification galvanized the entire electronic recordkeeping software industry. Potential buyers were calling for compliance with the standard. If you didn’t meet the standard, you could not sell software to much of the US market. And for a while, only one small Canadian software firm met this standard. The big document management software companies were suddenly paying attention….

 

Document Management

While e-records software was unfolding from 1983 – 2002, a parallel market was developing at the same time:

  • Document Management. These products provided an organized repository for all the electronic documents being created, and a powerful search capability so users could find the documents they needed. It made no sense for users in a business to store their documents only on individual computers. The demand for document management was growing rapidly. And this was no small business like electronic recordkeeping – this was truly big, really big. By 2002 this type of software was becoming pervasive in organizations around the world. It seemed then, as it does today, that pretty much every organization larger than a few hundred users needs some form of document management.

Document Management had grown in market install base as well as capability, and had by 2002 morphed into “Content Management”, or Enterprise Content Management (ECM) to be exact. IBM had its Content Manager platform. OpenText started out with its LiveLink ECM offering and through several acquisitions, including Canadian firm Hummingbird, grew its Content Server ECM platform. FileNet had its well-known P8 ECM product, heavily focused on image management. But the granddaddy of them all was Documentum out of California. Documentum completely dominated the pharmaceutical industry with large-scale ECM solutions for big-pharma companies around the world. If it was huge, it was usually Documentum. IBM was no slouch either – they had installations all around the globe, including one well-known installation (US Social Security Administration) of over 200,000 users. Hewlett Packard (HP) soon entered the ECM market with their acquisition of Autonomy. Remember that all of these “new” ECM platforms were merely the continuing evolution of document management – just with a fancier name and ever-increasing capabilities. Records management software meanwhile was a tiny, specialized industry being developed by a small number of relatively tiny software vendors. These two markets had yet to intersect in any meaningful way.

ECM was ECM. Records management was records management. And that was that….

Market consolidation has been relentless in this ECM market segment. Documentum was acquired by EMC, and EMC has since been acquired by OpenText. IBM has acquired FileNet.

These companies paid little attention to the electronic recordkeeping market – they had no reason to. However, with the first certifications of 5015.02, they were seeing “compliance with US DoD 5015.02” appearing in the “Mandatory” column of the RFPs they were responding to. Many of their US government buyers and prospects were telling them “We can no longer purchase your products unless they are 5015.02 compliant”. In Canada, and doubtless in other countries, compliance with 5015.02 was appearing on a good number of RFPs for ECM technology.

There was just one problem.

What the heck was this 5015.02 thing? ECM vendors were completely caught off guard – they did not see this coming. They had two choices: build, or buy. They could design and build these strange new features into their products, and ultimately achieve certification. Or, they could buy existing technology and incorporate it into their own products. Remember that by this time, we had been designing, building, and perfecting this technology for over a decade, with 100% exclusive laser focus on this market niche. It took an enormous amount of specialized knowledge, a large team of highly skilled developers, and a great deal of time, effort, and money to achieve 5015.02 certification. This work could not possibly be done quickly or easily, even by mighty IBM.

ECM vendors IBM, FileNet, Documentum, and OpenText all needed this certification – and the sooner the better, as the lack of it was having a negative impact on their sales quotas. There were three potential acquisition targets – my company in Ottawa, our local Ottawa competitor, and our friends in Australia. Four companies needed certification in a hurry, and only three small niche software companies had the technology.

Here’s what happened. I left my own company and formed a new company to build a second generation of the technology we felt was essential to success. My previous company was acquired by Documentum. My new company was acquired by IBM. My local Ottawa competitor was acquired by OpenText. The Australian company decided to hang tight and make a go of it on their own. FileNet, left at the altar with no bride, had no choice but to build recordkeeping features themselves, which they immediately set out to do. Several years later, the Australian company was finally acquired by Hewlett-Packard.

With the notable exception of the Australian firm I’ve been referring to, the nascent e-records software business essentially disappeared almost overnight, swallowed up in the rush by the ECM vendors to become DoD 5015.02 compliant before their competitors. The ECM vendors were now calling the shots. To this day I believe this was the right thing for the market. I came to learn that organizations with recordkeeping requirements have no choice but to buy software that meets their needs (usually expressed as 5015.02 compliance), but that doesn’t necessarily mean they end up using the features they buy. Recordkeeping was now just a “feature” of modern ECM. If you wanted electronic recordkeeping, you had to buy ECM first. Recordkeeping became a mandatory check-box in most RFPs for the acquisition of ECM. ECM sales kept climbing, and everyone was happy.

I cannot speak for the other acquisitions, but IBM did a fantastic job of absorbing my company. It meant a lot to IBM and it was extremely well executed. But I was anything but happy. After a few years, I noticed that despite all the sales and delivery of electronic recordkeeping solutions, few buying organizations were actually deploying the solutions. This was not just within IBM – but right across the market. ECM was being deployed with recordkeeping capability, but I could find no evidence that the new ECM recordkeeping capabilities were being used to truly manage electronic records.

I encountered many organizations who claimed to be managing their electronic records. When I looked into their projects, several disturbing things emerged time and time again. Many (too many) organizations had no idea how to go about deploying electronic recordkeeping. Some had included recordkeeping on their ECM orders because they had to, but never deployed it (shelfware). Many tried and failed to deal with electronic records, and reverted to managing paper records only, leaving electronic behind as unmanaged documents. I met one large US firm, a household name, who tried three times and spent millions trying to deploy electronic recordkeeping with two different ECM vendors, and eventually gave up in frustration.

To me, a successful deployment of electronic recordkeeping is defined by two simple criteria. First, electronic documents are being properly managed as records on a regular, daily basis by all users. And secondly, the corporate records manager is carrying out disposition of electronic records, in accordance with the retention schedule. On countless occasions, I heard stories of successful electronic records management. But once I peeled back the project’s layers to seek out my two success criteria, I’d never find them. Ever.
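As a rough illustration, the two success criteria above could be checked programmatically against a deployment. The data model below is a hypothetical sketch for illustration only, not any ECM vendor's actual API:

```python
# Illustrative check of the two success criteria described above.
# The dictionaries and field names are a hypothetical data model,
# not drawn from any real ECM product.
def deployment_successful(documents, dispositions) -> bool:
    # Criterion 1: electronic documents are declared and managed as records.
    all_declared = all(d.get("declared_as_record", False) for d in documents)
    # Criterion 2: disposition per the retention schedule is actually occurring.
    disposition_running = any(d.get("status") == "completed" for d in dispositions)
    return all_declared and disposition_running

docs = [
    {"name": "memo.docx", "declared_as_record": True},
    {"name": "budget.xlsx", "declared_as_record": True},
]
disp = [{"series": "HR-05", "status": "completed"}]
print(deployment_successful(docs, disp))  # True
```

The point of the sketch is that both conditions must hold: a deployment where documents are declared but no disposition ever runs (or vice versa) fails the test.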

The ECM software obviously met recordkeeping requirements, at least as laid out in 5015.02. Many ECM vendors went well beyond 5015.02 and delivered features such as content searching, e-discovery, and physical records handling. Most even delivered some form of email management, a critical component of electronic recordkeeping, as so many electronic records are emails. The ECM platforms were delivering strong, powerful features. Successful ECM solutions were being delivered from large, global companies with highly competent people.

So why wasn’t the electronic recordkeeping portion of these ECM solutions successful, even when the underlying ECM project was? As it turns out, there were a number of clear, obvious causes. Over the years, I began to notice the same root causes (project “factors”) across many different projects, different ECM technologies, and different types of organizations – causes that were preventing electronic recordkeeping from taking root and succeeding. Sometimes, a single factor alone was fatal to the recordkeeping. Sometimes a combination of factors was present. Either way, the outcome was not good – electronic recordkeeping was not taking place.

Following are the four project factors I had been seeing, and I am still seeing them to this day:

 

Low Recordkeeping Priority. A move to enterprise-wide ECM is a large, costly, and risky adventure for any organization. It represents a major shift in the organization’s Information Technology (IT) landscape and is extremely disruptive to users. A great deal of money and reputation is invested in ECM projects. It’s important that they are successful, and the tolerance for failure is low. In some cases, IT people feel their reputations, and even their positions, are at risk in the event of a failure. Not so with the recordkeeping aspect of the project. Recordkeeping is just “a minor part of the project”. Nobody will wind up an ECM project if the recordkeeping element of it is not satisfactory. Nobody’s reputation or job is on the line. If the recordkeeping part of a corporate ECM deployment fails – then ECM life goes on, without recordkeeping.

Relative to the overall ECM project, the importance of recordkeeping is small. In many organizations, the ECM is “too big to fail”, which renders the recordkeeping side of the project essentially disposable. In my experience, and this continues to this day, far too many ECM deployments have failed with electronic recordkeeping, and the corporate RIM professional is left managing paper records with the sophisticated electronic recordkeeping features sitting idle.

ECM Vendors know that as long as they meet requirements, they can sell their products. If the recordkeeping does not pan out, they’re not likely to see too much heat from a buyer who, in too many cases, doesn’t understand the technology well enough to understand how to deal with it.

In my view, this is an unintended consequence of recordkeeping being a feature of ECM. At the outset, everyone agrees that recordkeeping is a “must have” feature. So ECM proceeds. Later, recordkeeping fails. Are we going to throw out the entire ECM project and start over? Of course not. ECM is safe – it is always safe, and recordkeeping is not.

 

Ill-Equipped RIM Professionals. Some RIM professionals misinterpret my position on this and feel I’m saying RIM professionals are not able to handle this technology. But nothing could be further from the truth. Today’s RIM professionals are more tech-savvy, better educated, and better equipped than ever to deal with technology. But despite this, many of us are still sitting ducks in front of advancing ECM technology. The corporate records manager has to fully understand the incoming electronic recordkeeping capabilities of the new ECM software. That’s a lot to learn and master, usually in a short period of time. They must become 100% comfortable with the ECM technology itself by asking these questions: How does it work? How is metadata defined? How can I control the metadata values? How do I manage documents as records? What do I have to ask of the users? They have to get their retention schedule into the ECM. This is usually like fitting a square peg into a round hole. Most retention schedules are not structured suitably for modern ECM systems, and they need to be massaged at the very least, or completely restructured in some cases.

The RIM Professional has to master the ECM technology, then master the electronic recordkeeping capabilities, get the retention schedule re-shaped and loaded, then heavily influence the manner in which the ECM is configured and deployed in order for the recordkeeping to work properly. Usually, all of this has to happen on top of a 40-hour work week filled with everyday recordkeeping responsibilities. To put it mildly, this is a pretty big challenge for the best of us.

I also find that the effort and investment the ECM vendors put into the recordkeeping component is small relative to the effort put into the platform as a whole. I know of one ECM vendor who came perilously close to dropping recordkeeping altogether because “it wasn’t important enough”. In my opinion, the documentation, training, professional services, and aftersales support for the recordkeeping features often lag well behind the rest of the ECM platform. I have no doubt that this is merely a reflection of the priority of the recordkeeping features relative to the platform itself, but nonetheless there is a great deal of room for improvement in this area across all of the vendors. At the end of the day, it’s another element that makes it even more of an uphill climb for the RIM professional to come to grips with the technology.

It’s not the fault of the individual RIM professional – it’s the fault of the industry at large, including the vendors and the buyers. There is a massive educational gap that is getting worse, not better.

 

ECM Deployment Failures. Not all ECM deployments go well. I’ve not seen any two estimates of ECM deployment success that even define success the same way, let alone show any consistency of results or measures. Therefore it’s impossible to put a number to successful ECM deployments. But whatever that number is, it certainly is not close to 100%. Simply put, the ECM is the “vessel” electronic recordkeeping is riding in. If that vessel is unhealthy, and is rejected by end users, there’s nothing that recordkeeping can do to overcome it. The electronic recordkeeping capability of any ECM solution needs the ECM to be widely adopted by all end users, capturing not only the electronic records themselves (including email), but also the appropriate metadata that recordkeeping depends on. If this is not happening, it’s pretty much impossible to manage the documents as records by applying proper control to them, classifying them against the retention schedule, and applying disposition to them at the end of their lifecycle.

 

The RM-IT Gap. Because recordkeeping is a feature of the larger ecosystem of the ECM, it cannot operate independently of the ECM. The records themselves are documents stored within, and fully under the control of, the ECM. The ECM must be configured to determine the metadata applied to the documents. The ECM determines and applies security permissions. And it specifies where documents are stored, by defining locations such as folders, libraries, or whatever nomenclature the particular ECM applies to these locations.

This means that in order for ECM recordkeeping features to work, the RIM professional has to work very closely with the IT team to configure the ECM itself. Usually however, IT is simply not comfortable with the idea of RIM people telling them how to configure ECM. Worse still, few RIM professionals are comfortable with configuring sophisticated ECM systems. They have to learn a whole new technology and skill from scratch. The few RIM professionals who tackled this with gusto were often rebuffed by large IT departments who were moving ahead with ECM configuration according to their own blueprint, regardless of RIM. In too many cases, IT proceeded with ECM planning, configuration and deployment according to their own vision and approach, with little regard for the recordkeeping features. The RIM professional often lacked the mandate, the political clout, or the know-how to influence the ECM deployment to the extent necessary for a successful electronic recordkeeping deployment. I found the larger the organization, the larger the gap I would likely see. IT would blast ahead with ECM deployment, and the RIM professional would manage paper records, trying in vain to influence the ECM direction.

 

By 2012, many ECM vendors were starting to feel somewhat uncomfortable with the recordkeeping side of their projects. They perceived recordkeeping as complicated – a lot of trouble. Surely something simpler would do the trick?

By 2012 most ECM products were shipping with the ability to apply “Policies” to documents. A policy was a set of behaviours and characteristics that could be automatically applied to a document. A policy could be applied to a location (e.g. folder) such that all documents in that location automatically inherited the policy. Such policies specified how long the document was to be retained, when it would be destroyed, and any criteria that had to be met before the document was deleted. Sometimes these policies respected the retention schedule, but they did not always have to – it was possible to create all the policies you wanted, without any regard for the retention schedule.
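
To make the mechanism concrete, here is a minimal, entirely hypothetical sketch of such a policy (the names and structure are illustrative only and do not reflect any vendor’s actual API):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RetentionPolicy:
    """A simplified location-based policy: how long to keep a
    document, and whether a criterion must be met before deletion."""
    name: str
    retain_for_days: int
    requires_approval: bool

@dataclass
class Document:
    title: str
    created: date
    location: str    # e.g. a folder path

# Policies are attached to locations; every document stored in a
# location automatically inherits that location's policy.
policies = {
    "/finance/invoices": RetentionPolicy("Invoices", 7 * 365, True),
    "/hr/job-postings": RetentionPolicy("Job postings", 365, False),
}

def is_due_for_deletion(doc: Document, today: date) -> bool:
    policy = policies.get(doc.location)
    if policy is None:
        return False    # no policy: the document is simply never deleted
    expiry = doc.created + timedelta(days=policy.retain_for_days)
    return today >= expiry and not policy.requires_approval
```

Note that nothing in this sketch ever consults a retention schedule – which is precisely the risk described above.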

This new capability became quite popular with most ECM products, because it was seen as a way that their customers could delete documents without having to bother with the trouble and overhead of records management, which more often than not was going nowhere fast. Many organizations proceeded to use these simple policies that deleted documents based on IT-imposed criteria, or criteria that originated from the business units with no regard for the retention schedule. It was better than nothing, they reasoned. For me however this was a disturbing trend, more proof that the industry at large was not making any headway in deploying electronic recordkeeping.

To this day, many ECM vendors offer formal recordkeeping capability, but they also offer this generic “policy capability”. On more than one occasion I heard policies referred to by one vendor as “records light”, which made me cringe. Having spent my entire career building electronic recordkeeping technology, I was deeply disturbed with the realization that with few (if any) exceptions, there were no genuinely successful deployments of electronic recordkeeping that met my simple criteria. So I left IBM and formed a vendor neutral consultancy with a single mission – to help buyers deploy their electronic recordkeeping projects successfully.

While all this was going on, Microsoft had entered the ECM fray with their SharePoint offering in 2001, which few established ECM vendors took seriously. Anyone in the software business however knows not to take Microsoft lightly (ask Michael Cowpland). By 2012 Microsoft had slowly but steadily eaten into the market share of all ECM vendors, from the bottom up. The established vendors like EMC, IBM, and OpenText had to pay attention now, as SharePoint was growing up and becoming a genuine market challenger. And like all their ECM competitors, Microsoft soon ran into this thing called “records management”.

 

Microsoft

Microsoft SharePoint’s penetration of the market has been a slow, inexorable march to a point today where it is a force to be reckoned with. Some figures I found widely disseminated on the web:

  • SharePoint is a $2B business within Microsoft.
  • Microsoft claims 20,000 new SharePoint users are added every day.
  • 80% of Fortune 500 companies use SharePoint.
  • Microsoft claims 66% penetration of the enterprise.

The numbers above are years out of date (I could not find up-to-date numbers), but suffice it to say SharePoint has a major presence in the modern enterprise. That means more and more SharePoint sites are going to – you guessed it – be storing documents that need to be managed as records.

I ended the previous stage at 2012 for a good reason that I’ll get to shortly. For the moment, I need to take you back to 2006, when Microsoft’s adventure with recordkeeping began.

In 2006, Microsoft was starting to deliver SharePoint deployments to increasingly large US corporations and government agencies. It was a viable ECM contender by then, taking market share away from the bottom end of the share held by IBM, FileNet, OpenText, and the others. Some of their most cherished customer accounts delivered Microsoft an ultimatum. Build in recordkeeping compliance, or we have to drop SharePoint for a competitor.

So to great fanfare, Microsoft announced recordkeeping capability in their new SharePoint 2007 release. Suddenly the web was flooded with countless articles from Microsoft, and from SharePoint experts of all sorts, telling the world how to manage records with SharePoint 2007. You could now create policies that would delete documents automatically, in accordance with a retention schedule. I undertook to dissect this new capability to evaluate it for myself, and for my clients. I was most distressed by what I found. I read everything I could find on the web. I flew to Microsoft’s head office in Redmond, Washington, and spoke to them about these features. I met most of the five-person SharePoint records management team (one was from Montreal – a huge Habs fan). Great people, but I can tell you they had absolutely no background in records management. I even had the rare privilege to interview Microsoft’s chief SharePoint architect.

My conclusion – they blew it. There was no means of inputting or managing a proper retention schedule. There was no way to properly manage case records. And there was no disposition process. SharePoint simply went ahead and blew away the documents as soon as a time deadline was reached.

Microsoft launched a full-court press on records management. The massive constellation of SharePoint partners and experts were re-purposing and re-publishing Microsoft’s core messages about recordkeeping. To this day, I see thousands of web pages that tell us in excruciating detail how to manage records with SharePoint. I told Microsoft how they got it wrong. The response? Polite acknowledgement on how I was obviously wrong. They sent their corporate records manager out on the records management speaking circuit to tell the story of how they manage their own records in-house with SharePoint, and how the rest of the world can do the same.

Perhaps it was just me, but I was feeling very much like a pariah. I was a lone voice in the wilderness directly contradicting mighty Microsoft. As I found out, Microsoft has an awful lot of friends – their worldwide network of partners and dealers, all of whom were making money selling SharePoint services. And of course I was telling organizations that SharePoint did not manage records, while the firms were telling the same organizations that they could readily deliver electronic recordkeeping solutions with SharePoint. Microsoft backed them up every time. Who backed me up? Nobody of course. RIM professionals around the world were deeply skeptical of Microsoft’s claims, but it seems nobody could articulate exactly what it was that was wrong with SharePoint.

In 2009, Microsoft claimed they had heard the problems associated with SharePoint, and they announced, again with great fanfare, a slew of improvements to the recordkeeping feature set. Records in-Place! New Content Organizer Capabilities! New Records Center! Multi-stage retention policies!

The announcements hit most of the right points. By now some degree of skepticism had crept into the market about Microsoft’s ability to manage records. But again I went back to Microsoft to find out for myself what had changed. My conclusion – not enough. Nothing that would overcome the three original fatal shortcomings. By now I was getting somewhat tired of answering the same questions over and over again – what exactly is the problem with SharePoint? Few people were taking me seriously. Microsoft was still selling recordkeeping in SharePoint, and let’s just say I was not a fan favourite of the SharePoint partner world.

I’d had enough – something had to be done. My own reputation and credibility were starting to take a hit. So I wrote a detailed report that put the hard facts in writing. I knew I’d be challenged on every detail, so I researched every detail carefully, and validated everything with Microsoft. The report carefully stated the shortcomings, and detailed how to customize SharePoint to get it right. I asked Microsoft to review and verify the entire report for accuracy, which they did. They granted me permission to use a statement claiming that Microsoft had reviewed the report for factual accuracy. I offered the report to anyone who wanted it. ARMA International (https://members.arma.org/eweb/home.aspx?site=ARMASTORE) published it as a book. It soon circulated around the world. What happened? Nothing. Which was the best possible outcome, in my view. It was never challenged. Over the years, and not just because of my book, more people came to realize that perhaps Microsoft really did get it wrong. Microsoft even warmed up to me somewhat – they invited me to join their Canada-wide roadshow promoting SharePoint. I accepted their offer to explain how to manage records with SharePoint. That lasted exactly one session. Microsoft very nicely dumped me for being “too negative” on the recordkeeping. Back in the doghouse I went…

It’s time now to bring you back to 2012, where this third stage begins. Some enlightened software vendors in the records management space saw a market opportunity emerging as Microsoft floundered with records management. A whole new market segment was born – recordkeeping add-in software for Microsoft SharePoint. As of the date of this report, there are four vendors delivering these solutions – two in the US, one in Australia, and one in Canada. One of the four brings 5015.02 certification to Microsoft, and two of the other vendors have committed to the certification.

Now a SharePoint buyer could finally get genuine recordkeeping. Each of the four add-in vendors took a radically different approach to achieving recordkeeping with SharePoint, but they all got the job done, and done correctly, after a few false starts with a couple of the vendors. But what if Microsoft someday gets it right, and releases real recordkeeping as a native SharePoint feature? That would likely wipe out this nascent market of recordkeeping add-ins. I do not see that happening. Microsoft tells me that as long as there is a healthy market for recordkeeping add-ins, they have no business case for diving back into the records business. They prefer to focus on the platform, and encourage an ecosystem of third-party products and services to deliver customer solutions. As long as people buy SharePoint, they’re happy. So Microsoft is happy, and the RM add-in vendors are happy. Hopefully the SharePoint buyers will be happy too! From what I see, so far so good. A new RM add-in vendor joins the market roughly every two years, and I look forward to every new entrant.

But this market segment of just four relatively small vendors is doing something radically different than what I’m seeing in the rest of the ECM market (the Non-Microsoft vendors). The pace of innovation is astonishing. These RM add-in vendors don’t have to worry about any of the plumbing or architecture of an ECM – they put 100% of their effort into managing records, while leaving the tough ECM stuff to Microsoft. They don’t have to worry about a repository, about creating metadata, or even about searching – that’s all done for them. They’re free to innovate new ways to apply recordkeeping to documents, without worrying about anything else. Within Microsoft’s ECM competitors, relatively few resources are applied to the recordkeeping capabilities. But these RM add-in vendors have formed entire development teams, marketing teams, support teams, all devoted 100% to recordkeeping.

And there’s another radical difference. Most of the legacy ECM products are what I refer to as location-based, in that many of the behaviours and characteristics of documents are determined by the location in which the document is stored (which folder, library, etc.). That means that location matters, and users are constantly worrying about where something is stored. SharePoint turns that on its head with a location-independent approach, where location does not matter. One of the most pervasive end user objections to recordkeeping I’ve encountered over the years is that users do not want records to dictate where they store their documents. Until now, most (not all) ECM solutions dictated where the user had to store documents in order for recordkeeping to happen correctly.

And this is where the magic has truly taken place. These four vendors have finally produced genuine recordkeeping automation. Rules-Based Recordkeeping (RBR) is a software capability that allows the RIM professional to fully automate the following:

  • Determine which documents are records.
  • Decide when to declare documents as records.
  • Classify records against the retention schedule (correctly!).
  • Decide when to move records to a long-term repository.
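
The automation behind these steps can be sketched in miniature (the rule structure and field names here are entirely hypothetical; no vendor’s product works exactly this way): a rule inspects a document’s metadata and, on a match, declares the document a record and classifies it against the retention schedule, with no user involvement.

```python
from typing import Optional

# Hypothetical rules: (conditions on metadata fields, retention schedule code).
# A document matching all conditions of a rule is declared a record
# and classified under that rule's retention class.
RULES = [
    ({"content_type": "Contract", "status": "Signed"}, "LEG-100"),
    ({"content_type": "Invoice"}, "FIN-230"),
]

def classify(metadata: dict) -> Optional[str]:
    """Return the retention class for a record, or None if the
    document matches no rule (i.e. it is not declared a record)."""
    for conditions, retention_class in RULES:
        if all(metadata.get(k) == v for k, v in conditions.items()):
            return retention_class    # first matching rule wins
    return None
```

The point of the sketch is simply that classification is driven by metadata the system already holds, not by an end user’s judgement.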

This is the breakthrough I’ve been waiting 30 years to see. For 30 years, we’ve been depending on users to identify which documents are records, and to classify them against a retention schedule they do not care about or understand. This dependency on the end user has been holding this technology back since day one. It appears to me that these days are finally over. To be fair to the non-SharePoint vendors, some of the traditional ECM products have incorporated some degree of this automation, but I’ve not seen anything close to the level of automation I’m seeing in the SharePoint market segment.

If it sounds as if I’m promoting SharePoint solutions over other ECM products, well again you’d be wrong. Traditional ECM vendors will always have a strong market position. Even Microsoft readily admits they cannot be all things to all people. I have clients with ECM requirements that could never be met by SharePoint, and I doubt ever will. One of my clients uses Documentum for millions of aircraft maintenance records and drawings – they are not going to switch to SharePoint in my lifetime! Many installations of mature non-Microsoft ECM platforms are now finding that SharePoint usage is “creeping into” their organizations. Some will inevitably switch to SharePoint. Some will ignore SharePoint and carry on happily with no issues. And some will try to connect the two so they “work together” – SharePoint for document creation, and the traditional ECM as a formal records repository. Technically possible but a bear to implement and manage.

All non-Microsoft ECM vendors have the means of delivering good recordkeeping within their solutions. It just takes more time and work to reach the goal. And there’s absolutely no reason why these ECM vendors cannot build the same powerful RBR automation capabilities into their products. I would certainly encourage them to do so.

 

Where Are We Now?

ECM systems that deliver electronic recordkeeping capability have become well known in the market these days as EDRMS (Electronic Document and Records Management System). I’ll use this terminology from this point forward.

There are now three distinctly different EDRMS technology “streams” in the market today. They are:

 

Traditional ECM           These are the large ECM vendors I’ve been referring to (IBM, OpenText, etc.). In order to get EDRMS capability you have to invest in a sophisticated ECM platform, then utilize the recordkeeping component. As of the date of this report, these vendors have not innovated to the same RBR level as those in the SharePoint stream, but that could change very quickly.

 

SharePoint                   Here you have to buy a third party add-in to augment SharePoint. That’s a downside – any IT professional will tell you that integrating separate products is never the best technology option. However you get a higher level of focus and innovation from the RM add-in technology. And the big bonus is RBR – you can automate most recordkeeping functions such that you no longer need to depend on the end users to meet your recordkeeping performance numbers.

 

Independents               There are a few (I can identify only three, perhaps four) records-specialist software firms that cater primarily to RIM professionals, and have extended their products to include ECM capabilities. These products are “records-forward”, or “records-centric”. By this I mean these products are all about records, and are aimed at the recordkeeping market. I will include HP, with its HP RM offering, in this stream. Although the technology configuration for HP RM is the same as the others in this stream, the company is quite different. HP is a global powerhouse, whereas the other two are niche market players, and their size reflects this.

 

All three have their place in the market, and will continue to ship solutions that work. Very rarely (if ever) have I seen an organization select an ECM technology based on any recordkeeping requirements. Most often, the IT department will select either SharePoint or a traditional ECM technology, and run with it. The recordkeeping options then fall out of that particular technology choice. If they select OpenText for instance, they will have to use the recordkeeping options of OpenText. If they select SharePoint, they will have to select one of the four RM add-in products to utilize.

I do not like what I’ve seen to date with solutions delivered within the Independent stream. I’ve seen some serious difficulties achieving end user adoption, particularly in small to medium deployments (fewer than 1,000 users). These products will likely fit best in corporate cultures that accept a strongly records-forward operating stance. I believe it will take more work to achieve end user adoption in this stream than with the other streams.

Remember the four problems I mentioned earlier that have been plaguing EDRMS projects to date? They’re still there. So what has changed? The technology has changed, particularly within the SharePoint stream. With RBR, the odds that you can overcome the barriers to success just went up – way up. If you suffer a low priority of RM versus ECM, it’s not quite as difficult to overcome. It’s not as difficult for the RIM Professional to skill up and become equipped for success. The other two factors really don’t change. The ECM itself is still challenging to successfully deploy. And if the RIM professional and IT are not working closely together, recordkeeping is still doomed.

 

Email

By my reckoning, emails comprise anywhere from 30-80% of an organization’s total digital records, all of which need to be controlled and managed properly as records. Put another way, there are often 3-5 times more emails than documents that meet the criteria of a business record, and these emails should (must) be managed as records. No organization can claim they are managing their electronic records unless they are applying records control to their email.

Technologically, email remains the Achilles heel of any modern EDRMS project. The problem is that most organizations use Microsoft’s Outlook/Exchange platform for email. This email platform has no recordkeeping capability whatsoever. It’s an island of massive volumes of stored information that’s disconnected from the ECM in every way. Even Microsoft’s SharePoint ECM platform is not connected to email. In the ECM stream, each ECM vendor has to write special integrations between their products and Exchange in order to provide their users an easy way to get their emails into the ECM where they can be managed as records. In the case of SharePoint, yet another third party product is required to integrate email with SharePoint. For any reasonable EDRMS solution in the SharePoint stream therefore, three separate technology acquisitions are required – SharePoint, the RM add-in, and an Email Integration product.

Even when email is tightly integrated with the ECM platform such that users have an easy way to get their emails into the ECM, the choice of emails to put into the ECM remains a voluntary end user decision. To a user, this means they have to decide which emails are important to the organization. Then, they have to go through the process of actually submitting the email to the ECM, which means they have to fill in mandatory metadata that identifies what the email is about. This adds up to end user effort, which once again puts us back into the dark years of the past where we depend on end user discretion and effort. And we all know how that has served us to date!

There are some innovative new technology solutions (I can identify one for sure, and perhaps a second) that actually use artificial intelligence, or some semblance thereof, to read the email in the inbox and determine whether it is a record or not. Such a solution ranks all email by the likelihood that it is a record, and by how it should be classified. Does this technology work? Sort of. In some circumstances, with certain types of email (predictable, well-described email), it works very well. In other cases, it’s positively awful. Overall, it has great promise, but it’s a long way from everyday usability. I note also that the cost, and the technology overhead necessary to support this whiz-bang capability, will take your breath away. Therefore I believe this capability is only suitable for large-scale, well-funded projects with extremely deep resources.
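
Purely to illustrate the concept (a toy keyword scorer; the actual products use far more sophisticated techniques, and these signal words are invented for the example), ranking an inbox by record-likelihood might look like:

```python
# Toy illustration: score each email by how strongly its text suggests
# it is a business record, then rank the inbox from most to least likely.
RECORD_SIGNALS = {
    "contract": 3, "invoice": 3, "purchase order": 3,
    "approval": 2, "agreement": 2, "lunch": -2,
}

def record_score(subject: str, body: str) -> int:
    """Sum the weights of every signal term found in the email text."""
    text = f"{subject} {body}".lower()
    return sum(w for term, w in RECORD_SIGNALS.items() if term in text)

def rank_inbox(emails):
    """Sort a list of {'subject': ..., 'body': ...} dicts by score."""
    return sorted(
        emails,
        key=lambda e: record_score(e["subject"], e["body"]),
        reverse=True,
    )
```

A real product would rank on far richer features (sender, recipients, attachments, business context), but the output is the same idea: a prioritized list for a human, or a threshold for automatic declaration.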

 

The future of electronic recordkeeping

When I ponder the future of recordkeeping, I could readily try to predict plenty of technology trends – that’s always fun. I could predict that we’ll find ourselves pretty much entirely in the cloud someday. I will certainly predict that the ECM vendors will catch up with the SharePoint RM add-in vendors and deliver RBR. But these predictions are not going to help us. Besides, that’s not what I’m seeing in my crystal ball. All I see in the future is the one thing that obsesses me these days. That one thing? Education.

That’s what we need more of. Lots of it. All the great technology in the world won’t help us if we don’t know how to utilize it to meet our goals. We have plenty of technology, and that technology has now advanced to the point where we can largely automate the recordkeeping processes. In the past, the technology has been holding us back. Not anymore. Now, we’re holding ourselves back. We need to better equip RIM professionals with the knowledge and skills to understand ECM to the level where we can influence how ECM is configured for deployment. We need to understand this modern EDRMS software better, to learn how to automate using RBR. And we need to better understand EDRMS project management methods and techniques, such as defining key performance measures to ensure the health of the projects.

Not too long ago I suggested to a large, global records management firm that EDRMS projects were not very successful, and that a massive, large-scale education program was needed to turn the tide. The reaction? “Our customers are doing just fine – you don’t know what you’re talking about”.

We now have the technology to get the job done. Think of a modern passenger airplane sitting on the tarmac. It’s nothing but a useless, expensive piece of wonderful technology if we don’t have a pilot with the training and skills to fly it. In the EDRMS business today, there are far too many planes parked on the tarmac. So what’s holding us back?

 

Works Cited

1 As of the date of this paper, RDIMS has since become known as GCDOCS.

From Chaos to Order – A Case Study in Restructuring A Shared Drive

 

By Anne Rathbone, CRM and Kris Boutilier

 

Synopsis

In 2011, the Sunshine Coast Regional District (SCRD), in British Columbia, embarked on a project to restructure the shared drive used by all its employees. The new shared drive went live for staff in March 2012, and by December of the same year all records had been migrated to the new drive. This case study describes the SCRD’s management of electronic documents, how the new drive was restructured, and the lessons learned from the project.

 

Introduction

Incorporated in 1967, the SCRD is one of the smaller regional districts in British Columbia, encompassing almost 3800 square kilometres of land and providing services to more than 28,000 residents. There are about 250 employees spread over 14 sites and nearly 200 use computers in some fashion and therefore generate electronic documents.

Regional districts are similar to counties in other parts of Canada, providing municipal-style services to unincorporated areas as well as providing region-wide services to member municipalities. Local governments, including regional districts, are often segregated along business lines. The SCRD has 98 diverse and entirely distinct business units including:

  • Accounting, payroll, financial and investment services;
  • Legislative services and bylaw compliance;
  • Solid waste management;
  • Water supply and distribution;
  • Parks planning and management;
  • Four distinct fire departments;
  • Five separate recreation facilities; and
  • Transit.

Each department has very specific needs with regard to managing records, and often these needs are driven by external legislation. As an example, our fire departments not only respond to fire and accident calls; they also manage their radio licenses and leases for radio towers, and they have acquired their own records management system (Fire RMS). This software specifically manages their emergency records and is maintained and operated by an external service provider. In addition to the four fire departments directly operated by the SCRD, there are also two independent fire departments – Sechelt and Pender Harbour. Although all six departments operate independently, the SCRD operates a service for all of them to provide 9-1-1 dispatch and to maintain the dispatch radio network, which is handled through third party contractors.

However, as we are legally one single organization, the information must be treated consistently with respect to handling, classification and retention, regardless of how or where it is created.

At the time the SCRD embarked on the shared drive restructure project, there was no system in place to manage electronic records. The details of the state of the SCRD’s electronic records will be provided later in this article.

The classification schedule from the Local Government Management Association (LGMA) of British Columbia’s Records Management Manual was being used for hard copy records, which are maintained in a central file room and an in-house inactive records centre. The structure of the schedule is such that it reinforces the concept that the SCRD is ultimately one corporation, even though it usually thinks in a compartmentalized manner. As an example, under the large function “Engineering and Public Works”, there is a primary 5280 Environmental Management and secondaries related to air quality, chemicals, noise control, and so forth. Retention is applied at the secondary level. See Figure 1.

History of SCRD’s Shared Drive

Before the SCRD started the shared drive restructure project, finding a document was like finding your way through a maze. The original shared drive, called “H:” was established in 1994 to take advantage of a new Novell Netware file server. There were about 35 computers all at one site and all running DOS. Creating the shared drive was intended to eliminate the “floppy disk shuffle”.

The H: drive structure was modeled around the approximately 22 major departments that existed at that time. The contents of the departments’ folders were intended to be transitory – conceptually, draft documents were prepared within a department’s private folder and the final edition eventually moved to a shared access folder. Shared folders were organized by year and originating department, much like how the SCRD organized its correspondence files at that time. Any sensitive files, such as Human Resources, remained on floppy disk and were stored under lock and key.

The hard copy was considered the ‘R’ecord and the electronic version was simply for retrieval convenience and to allow reusing well-formatted documents such as agendas and minutes (this occurred long before Microsoft Word Document Templates existed).

Over time, the SCRD added dBase databases and other structured data systems such as Accpac. Those systems also stored their data on the file server where it was reliably backed up and protected against corruption.

There was a file name limitation of eight characters, with a three-character extension. DOS did not require the extension to reflect the application used to create the file, so when working with WordPerfect documents the extension was often used to store the initials of the document creator.

The H: drive grew and evolved resulting in:

  1. ‘Walled gardens’, where departments could and did do anything they wished; and
  2. Folder structures that ‘evolved’, theoretically to match a department’s changing needs.

As technology changed, and with the elimination of the restrictions in file name length, users found they could create subfolders and include information in the subfolder title that more correctly belonged in the file name itself, such as creating “Year” folders instead of putting the year directly in the file name.

In 1998, the SCRD experienced a sudden and dramatic corporate restructure. The H: drive no longer substantially followed the SCRD’s organizational structure. With this change, and with ongoing growth in the size of the organization, there was an increasing need for collaboration across the departments. To handle this need, a set of “teams” folders were created, with explicit permissions granted to specific users collaborating on select projects. The H: drive and the SCRD’s organization structure continued to diverge as new services and departments were added without modifying H: drive.

Most importantly, permissions were modeled around the “named user”. Complications arose when a user left the organization or when collaboration on a project transferred to another employee.

By 2012, the H: drive contained over 465,000 files in over 40,000 folders. Due to the volume of files:

  • Duplication was rampant – some staff copied a document to where they thought they would be sure to find it again; and
  • Disappearances were frequent – other staff would find a document and then move it to where they thought it belonged, without advising the author of the move.

Since no rules had been developed to explain how records were to be saved, there was simply no way to train new staff on how the shared drive worked, what was to be saved there, or how it was to be saved.

Because of the walled gardens and lack of naming conventions, it was extremely difficult to find an electronic version of a document when the corresponding hard copy was destroyed. This made it impossible to apply the SCRD’s retention schedule to electronic documents.

As a result, the SCRD faced numerous issues:

  • A lack of confidence that the searches for electronic documents were sufficiently complete for either Freedom of Information requests or litigation and claims;
  • Concern there may be violations of the Freedom of Information and Protection of Privacy Act of BC (FOIPP) – some personal information collected in the course of business was stored in open folders;
  • Information leakage – some personnel documents and in-camera minutes were stored in open folders, permitting access beyond the required staff;
  • Zombie documents proliferated – if documents otherwise eligible for deletion were not saved in the appropriate folder they could linger forever, shambling back out into the light at exactly the wrong time;
  • Lost files, including the SCRD’s bylaws;
  • The volume of password-protected documents grew, and those documents effectively became invisible to searches;
  • Collaboration between departments was difficult – resulting in documents being emailed between staff, which were then named and saved in different folders, amplifying the duplication problem; and
  • Proliferation of subfolders with a user’s name (for example, Jane’s Stuff).

Due to the lack of controls on subfolder creation, the H: drive had evolved unbelievably deep folder structures. SCRD staff lost files because the full paths – folder names plus file name – grew too long for Windows to handle reliably.

All minutes and agendas were stored in a central folder, and correspondence was organized by year and month, then by originating department. So, staff had to remember the month they wrote a letter in order to find it again – and by the time the restructure project was started, there were almost 230 months from which to choose.

Figure 2 is a circular treemap, which is a graphical model that attempts to summarize the overall structure of a shared drive. The central black dot is the top level of the drive and, moving outwards, each ring represents a level of folders, so folders-within-folders quickly build out as ‘spikes’. The overall size of a spike indicates the depth of the subfolder tree needed to reach that folder. Figure 2 captures all 40,405 nested folders in the H: drive, where every branch of folders has been growing independently of all the others. Particularly evident is the lack of symmetry, including seemingly random excursions as far as 14 folders deep. Every asymmetric aspect is an exception with some unique reason for existing that must be contemplated when navigating – yet that reason is likely only known to the person who created the subfolders. It is also worth noting that records could be spread out across any level of folders, anywhere, not just in the final folder of a particular path.
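A profile like the one in Figure 2 can be approximated from a simple folder listing. The sketch below is illustrative only (it is not the tool used to draw the treemaps): it tallies how many folders sit at each nesting depth, which is enough to spot deep, asymmetric spikes.

```python
def depth_profile(folders, sep="\\"):
    """Map nesting depth -> folder count for a list of folder paths
    relative to the drive root (depth 0 = a top-level folder)."""
    profile = {}
    for path in folders:
        depth = path.strip(sep).count(sep)
        profile[depth] = profile.get(depth, 0) + 1
    return profile

def deepest(folders, sep="\\"):
    """Return the longest excursion, e.g. a 14-deep spike."""
    return max(folders, key=lambda p: p.strip(sep).count(sep))
```

On a live drive the folder list would come from a directory walk; here the paths are supplied directly so the logic is easy to verify.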

 

 

Figure 3 extracts the 8500 folders in the Infrastructure Department walled garden (the portion of the overall graphic outlined by the red wedge). The tiny black line, visible at both scales of the treemap, identifies an example folder at the end of the deep spike: \Infrastructure\Sustainability\1425-20 – Public Education Program\Admin\Email Saved\Recycling Feedback\General\For Curbside Pick Up\Roberts Creek

 

  

Budget Impacts

In 2003, consultants were hired to evaluate the SCRD’s requirements for an electronic document and records management system (EDRMS), which led to the preparation of a draft Request for Proposals. The consultants also included design recommendations for the new administration office building that was under construction. While the design recommendations were incorporated, the EDRMS funding was frozen by senior management as it was felt the project represented too much change for staff when combined with the move to the new building.

In 2007, a budget was put in place for a full-time Records Management Technician, with the goal of achieving two major projects: to inject the LGMA classification schedule into the H: drive and to implement an EDRMS. In 2008, a budget submission for an EDRMS was again submitted, but was not approved. And, in 2011, the SCRD Board of Directors approved $10,000 to improve the structure of the H: drive.

 

Project Structure

With the SCRD Board of Directors’ approval of funding, the restructure project was initiated. Two committees were struck: the steering committee and the working committee.

The steering committee consisted of the Corporate Officer and her Deputy, the Records Management Technician, the Manager of Information Technology (IT) Services and two of her staff. The Corporate Officer and IT Manager were tasked with communicating to SCRD management critical aspects of the project, such as:

  • Why the project was needed;
  • Timelines;
  • Work each department would need to accomplish;
  • Budget and staffing requirements.

The steering committee quickly determined that the H: drive could not be fixed and that a new shared drive was required – the N: drive.

It was critical that the steering committee understood each other’s perspective in order to accomplish the goals of the restructure project. From an IT perspective, with storage becoming ever cheaper, Records can be kept forever as capacity is not an issue, search technology is improving very quickly, and there are innovations in computational knowledge extraction.

Records and Information best practice requires that Records be managed from creation to final disposition – just because we can maintain everything forever does not mean we should.

The working committee consisted of at least one representative from each department – the representative would be that department’s advocate during the project. In addition to bringing their department’s needs and wants for the shared drive forward to the committee, the representatives communicated to their respective departments the reasons why the project was needed and how the SCRD would accomplish the change. As well, the representatives were to communicate specific processes each department was required to accomplish such as:

  • Reviewing files in their H: drive folder and deleting what they could; and
  • Determining the subfolders the department would need in the new shared drive so the drive could be seeded before cut-over.

A price request, project outline and desired outcomes were sent to several consultants. A request for proposals was not needed as the project budget of $10,000 was below the threshold set by the SCRD’s Purchasing Policy.

A consultant was hired in September, 2011. By November, 2011, on-site work had begun.

The on-site work included folder structure revision and enhancement, as well as interviews with staff to solicit specific knowledge of each department’s requirements. The original timeline was to implement the new shared drive structure on January 1, 2012; due to staff and consultant workloads, cut-over was achieved March 1, 2012.

The steering committee decided it was important to set a target date when the H: drive would disappear and felt six months would be sufficient – August 31, 2012. The H: drive would then be completely deleted by December 31, 2012.

The desired outcomes for the project were:

  • Resolve internal collaboration issues;
  • Implicitly purge electronic files that were beyond retention;
  • Reduce information duplication;
  • Improve the quality of search results;
  • Provide certainty for Freedom of Information and litigation searches;
  • Preparatory work for the eventual import of electronic files into a formal EDRMS; and
  • Rudimentary ‘knowledge capture’ from senior employees nearing retirement.

The project outline seemed simple:

  • Create a new folder framework;
  • Determine lower level folder structures;
  • Create a draft design of the folder structure;
  • Determine and document permissions;
  • Set up the new folders and infrastructure;
  • Train staff;
  • Migrate to the new folder structure; and
  • Delete the old drive.

However, the scope of the project was quite large as it included all electronic records and email, as well as providing the ability to restrict access to confidential records.

In creating the folder framework, the SCRD required the consultant to design a high-level folder structure that would be universal across all departments. The consultant met with the steering committee to gather the requirements for the folder structure.

Part of the project outline was the requirement that the consultant help the steering committee adapt the LGMA schedule for use with the SCRD’s electronic documents. Using the same schedule for both paper and electronic would reinforce for users the concept that the information was the same, only the media was different.

 

Folder Structure

The initial high-level folder structure used the 16 functional folders from the LGMA schedule:

  1. Administration
  2. Assets and Procurement
  3. Buildings, Facilities and Properties
  4. Community Services
  5. Finance
  6. Human Resources
  7. Information Systems and Service
  8. Infrastructure
  9. Land Administration
  10. Legal Matters
  11. Legislative and Regulatory Affairs
  12. Parks Administration
  13. Planning and Development
  14. Protective Services
  15. Recreation and Culture
  16. Transportation and Transit Service

The lower level folder structure would be specific to each department. To accomplish this, the consultant conducted a file and document analysis and interviewed staff in each department.

In the LGMA classification schedule, each primary has a “general” secondary. Wherever possible, this was eliminated to encourage staff to file in an appropriate secondary without a default option.

An absolute limitation on tree depth was imposed – function, primary, secondary and optional folder. Figure 4 shows the folder structure for grants received from organizations.
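The four-level limit lends itself to mechanical enforcement. A minimal sketch (a hypothetical helper, not SCRD code) that validates a proposed path against the function / primary / secondary / optional-folder rule:

```python
def within_depth_limit(path, limit=4, sep="\\"):
    """True if `path` (relative to the N: drive root) stays within
    the four-level limit: function, primary, secondary, optional."""
    levels = [part for part in path.split(sep) if part]
    return 0 < len(levels) <= limit
```

A request to create a folder beyond the limit would be rejected and referred to the Records Management Helpdesk.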

 

While staff had access to and could create files within any pre-existing secondary or optional folders they could see, any modification, addition or deletion to the folder structure would be made by the Records Management Team. The team consisted of the Record Management Technician, Corporate Officer and Deputy Corporate Officer. The ability to create subfolders was removed to ensure that the N: drive would not have the proliferation of subfolders that the H: drive had. There were some exceptions when the volume of subfolders required daily was high – such as building, zoning and development permits. Subfolder creation requests were sent to the Records Management Helpdesk and the final decision rested with the Records Management Team.

As previously mentioned, some modifications of lower level folders were required to accommodate departmental needs. Figure 5 shows the modifications necessary for zoning applications – very complex items that require several subfolders. Using the traditional LGMA structure, subfolders would not be possible as they would violate the “four levels of folders” rule.

Figure 6 shows the modifications required to accommodate the segregation of operations, which allowed subfolders for each waste water treatment plant. In this case, while each waste water treatment plant operates independently, each operation is essentially the same. Therefore, each secondary has the same subfolders.

 

Project Issues

Email

A vast amount of business knowledge is contained in email, and users tend to hoard messages, in keeping with the adage, “If in doubt, keep it.” This creates problems when there is a conflict between policies and corporate needs. There is also the perception that emails are ‘different’ from other electronic documents. And classifying email is hard – there can be a rich variety of content in a chain of messages, making it difficult to identify a principal classification.

The SCRD has regularly conducted training for users on specific rules for how and who should be saving emails:

  • The originator of an internal email would save it in the N: drive. Recipients would delete the email;
  • Recipients of external email would save the incoming email in the N: drive. If there was more than one recipient within the SCRD, the first person named would save it;
  • All emails would be saved in the Outlook or .msg format to preserve headers and metadata;
  • Inboxes had a 500 Mb limit applied to encourage saving in the N: drive. If the limit was exceeded, emails could be received but not sent.
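The “who saves it” rules above can be expressed as a small decision function. The sketch below is illustrative only; the domain name and address handling are assumptions, not SCRD specifics.

```python
def who_saves(sender, recipients, internal_domain="scrd.ca"):
    """Return the address responsible for filing an email in the
    N: drive, or None if no internal address is involved.

    Rules: the originator saves internal mail (recipients delete it);
    the first internal recipient named saves external mail.
    `internal_domain` is an assumed placeholder domain."""
    def is_internal(address):
        return address.lower().endswith("@" + internal_domain)

    if is_internal(sender):
        return sender
    for recipient in recipients:
        if is_internal(recipient):
            return recipient
    return None
```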

Unfortunately, this training and the specific rules have produced low quality results when there are large numbers of emails to move into the shared drive. Without automation, it will be difficult to improve.

Originally the steering committee decided that when sending emails internally, hyperlinks were to be used and any attachments would be stripped. This was to decrease the amount of file duplication as the attachment lingered in the sender’s “Sent Items” as well as the recipient’s “Inbox” and possibly already existed in the N: drive. A Microsoft Exchange transport rule can easily be built to block attachments. It was also determined that many system functions operate via attachments, such as Calendar appointments and vCards (virtual business cards in Outlook), so the transport rule was built with a size threshold. The result: the Exchange transport rule was considerably more complex than originally anticipated.

Discussion with the working committee determined that attachments were necessary for consistency if an email was sent internally and externally. This often occurs when distributing agenda packages to SCRD committees as well as when sending certain types of information. Therefore, the rule to strip attachments for internal emails was relaxed. The staff were trained to include a hyperlink, but if there was an attachment it was allowed to go through. The sender received an automated warning and a copy of the email was placed in an inspection folder for later review to measure compliance.
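The relaxed rule amounts to: warn on any internal message carrying a genuine document attachment, but let small system-generated items through. Below is a simplified model of that logic; the extensions and size threshold are illustrative assumptions, and the real implementation was an Exchange transport rule, not Python.

```python
import os

SYSTEM_EXTENSIONS = {".ics", ".vcf"}   # calendar items and vCards (assumed types)
SIZE_THRESHOLD = 20 * 1024             # illustrative threshold, in bytes

def triggers_warning(internal_only, attachments):
    """Decide whether an internal email draws the automated warning
    (and a copy into the inspection folder).

    `attachments` is a list of (filename, size_bytes) pairs."""
    if not internal_only:
        return False                   # external recipients: attachments allowed
    for name, size in attachments:
        extension = os.path.splitext(name)[1].lower()
        if extension in SYSTEM_EXTENSIONS and size <= SIZE_THRESHOLD:
            continue                   # a system function under the threshold
        return True                    # a real document: warn, keep a copy
    return False
```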

NTFS Permissions

Microsoft’s New Technology File System (NTFS) was introduced in 1993. It is the suite of mechanisms by which Microsoft ‘Server’ operating systems, and now all versions of Windows, store and retrieve persistent data, such as files on disk. NTFS includes an extremely flexible permissions or ‘access control list’ framework that allows for almost any conceivable arrangement of access and change control.

However, this flexibility massively increases the risk of unintended consequences, especially when permissions are maintained manually. Mistakes in permissions create holes; users will discover them and probably will not mention what they have discovered. Essentially, the user may believe, “The system didn’t prevent me, therefore it must be permitted”. It is necessary to ensure permissions are structured so they can be applied consistently:

  1. Do not allow ad-hoc exceptions, if at all possible;
  2. Make the permissions predictable and pattern-based;
  3. Use tools such as Somarsoft DumpACL, Hyena, or PowerShell to audit the file system permissions after they have been established, if building the permissions manually; and
  4. Leverage inheritance (see Figure 7).
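An audit of the rules above reduces to one question: which folders carry access entries that differ from their parent’s? Each such folder is an explicit exception to inheritance that must be maintained by hand. The sketch below uses a simplified dictionary in place of real NTFS ACLs, which a tool such as DumpACL, Hyena, or PowerShell would export.

```python
def audit_exceptions(acls, sep="\\"):
    """Flag folders whose access control entries differ from their
    parent's -- each is an explicit (non-inherited) exception.

    `acls` maps a folder path to a frozenset of (principal, right)
    pairs; a deliberately simplified stand-in for real NTFS ACLs."""
    exceptions = []
    for path, acl in sorted(acls.items()):
        parent = path.rsplit(sep, 1)[0] if sep in path else None
        if parent is not None and parent in acls and acl != acls[parent]:
            exceptions.append(path)
    return exceptions
```

Run against a clean, inheritance-based structure, the list of exceptions should be short and entirely intentional.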

Restricted Folder and Files

Determining how to handle the SCRD’s confidential documents required long discussions. Human Resources’ internal records had to be restricted to Human Resources, but how could access be restricted to personnel documents that managers create? Should the manager of one department be able to see the personnel documents relating to another department? What about supervisors? How could litigation, claim and accident files be efficiently managed when access would depend on who was involved?

Ultimately “restricted” functions, which paralleled the unrestricted functions, were created. Following permission management best practices, access to those “restricted” folders was explicitly granted to job roles, rather than named users, to reduce ongoing maintenance risks.

It was made clear to users that just because they did not want anyone to “see” a document, it did not make the document confidential. FOIPP was used as the guide to identify what was confidential.

This structure had the added advantage that the Microsoft Search indexer could reliably exclude all the restricted files and a second, separate index could be created that only includes them.

Figure 8 shows the structure in Legal Matters (Restricted) function; only the Chief Administrative Officer (CAO) and the Corporate Officer have top level permissions. Access to the subfolders would depend on who is involved and permissions would be set on a case by case basis. For example, the Transit Supervisor has access to the Transit accidents but not the Fleet accidents.

Understanding and leveraging NTFS ‘permission inheritance’ is crucial – defining them at the right point in the folder tree minimises maintenance effort and risk of errors.

 

 

Final Top-Level Folder

With the restricted folders, the final result was 24 top-level folders:

  1. Administration
  2. Administration (Restricted)
  3. Assets and Procurement
  4. Assets and Procurement (Restricted)
  5. Buildings, Facilities and Properties
  6. Community Services
  7. Finance
  8. Finance (Restricted)
  9. Human Resources
  10. Human Resources (Restricted)
  11. Information Systems and Service
  12. Infrastructure
  13. Land Administration
  14. Legal Matters
  15. Legislative and Regulatory Affairs
  16. Legislative and Regulatory Affairs (Restricted)
  17. Parks Administration
  18. Planning and Development
  19. Protective Services
  20. Protective Services (Restricted)
  21. Recreation and Culture
  22. Recreation and Culture (Restricted)
  23. Transportation and Transit Service
  24. Transportation and Transit Service (Restricted)
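Seeding the top-level folders is a one-time, scriptable step. A sketch under the assumption that each restricted twin takes the base function’s name plus a “(Restricted)” suffix, following the list above:

```python
import os

FUNCTIONS = [
    "Administration", "Assets and Procurement",
    "Buildings, Facilities and Properties", "Community Services",
    "Finance", "Human Resources", "Information Systems and Service",
    "Infrastructure", "Land Administration", "Legal Matters",
    "Legislative and Regulatory Affairs", "Parks Administration",
    "Planning and Development", "Protective Services",
    "Recreation and Culture", "Transportation and Transit Service",
]

# Functions with a restricted twin, per the final 24-folder list.
RESTRICTED = {
    "Administration", "Assets and Procurement", "Finance",
    "Human Resources", "Legislative and Regulatory Affairs",
    "Protective Services", "Recreation and Culture",
    "Transportation and Transit Service",
}

def top_level_folders():
    """The 24 top-level names: 16 LGMA functions, each restricted
    function immediately followed by its '(Restricted)' twin."""
    names = []
    for function in FUNCTIONS:
        names.append(function)
        if function in RESTRICTED:
            names.append(function + " (Restricted)")
    return names

def seed(root):
    """Create the folder skeleton under `root` (idempotent)."""
    for name in top_level_folders():
        os.makedirs(os.path.join(root, name), exist_ok=True)
```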

 

Training

Prior to the cut-over to the N: drive, all staff were required to attend Managing Information in N: Drive Training (MINT) – and were appropriately rewarded with mints, chocolate mint cookies and mint tea! These rewards were enthusiastically received by staff and similar marketing tools have been used effectively for other RIM training.

Support from senior management for this training was crucial; without attending MINT, staff would be lost once the N: drive went live. Once staff attended the training, IT provided them with read-only access to the N: drive so they could explore and become familiar with it.

There was some push-back from a few staff members and certain managers. Some of the concerns were legitimate, such as lost productivity, or that there was no budget to bring in relief staff. However, some staff simply refused to do the training because they did not see any value in the project. To ensure the success of the project, it was decided that training was mandatory and those who continued to refuse training would not be using a computer.

Training was done by department, which allowed the training to focus on the department’s specific questions.

To accommodate all staff:

  • Training was scheduled over a six week period and training space was dedicated to the project;
  • Presentations and discussions were used, as learning the concepts was the focus, not how to use Windows;
  • Staff from remote sites came in to the main office; and
  • Staff working outside the main office hours were specially scheduled to be able to attend.

The training included:

  • A basic overview of records management and legislation requiring the management of SCRD records;
  • Why the shared drive needed to be cleaned up;
  • New rules for the new drive;
  • Requirement to use hyperlinks instead of attachments when sending internal emails;
  • Naming conventions;
  • An overview of the classification schedule;
  • How to classify their documents; and
  • Instruction on basic Windows concepts (hyperlinks, shortcuts and searching).

All attendees received a cheat sheet to help them find where to save files and a list of acronyms and abbreviations commonly used by the SCRD. The training, cheat sheet and acronyms were also available on our intranet.

 

Putting it Into Practice

Pre-cutover Activities

Because the H: drive had never been purged, departments were encouraged to review existing files and clear out “the garbage”. Several departments were enthusiastic about the change and some even rearranged their folders on the H: drive to reflect the new structure. Some departments dedicated specific staff to purging and preparing to move files, some said everyone would be responsible for their own files, and some hoped the whole thing would just go away!

After training, any time an attachment was emailed internally, the sender received an automated warning message.

Prior to this project, IT had been using conventional ‘roaming profiles’, whereby a user’s personal files (My Documents etc.) were copied down from a central server to the desktop they were logging on to, and back at the end of the session. This was upgraded to ‘folder redirection’, whereby users’ files were always saved directly to a central server. This allowed implementing file system quotas on personal folders, as a pre-emptive strike on hoarding.

 

Point of No Return and Go-live Pain

At 11:50 pm, February 29, 2012, the H: drive permissions for all users were reset to Read and Delete. At 12:01 am, March 1, 2012, after a very large deep breath, the Read-only flag was removed from the N: drive.

All users were now able to create/read/write files on the N: drive. All users were also able to read/delete anything on the H: drive, with the exception of Human Resources, certain databases, and the “Teams” folders.

On the first day after cutover, the Records Management Helpdesk had over 250 requests, not all with a professional tone. As the SCRD only has 250 employees, the number of requests the first day was, essentially, a one to one ratio to the number of staff. Those staff that had not yet taken the training couldn’t understand why they could not create subfolders anymore. People had trouble finding files and several demanded that the H: drive be re-instated. The Corporate Officer, in response to the first such demand, suggested that, as it had only been 45 minutes, they might want to wait a little longer. In addition to staff’s frustrations, we found there were some permission errors.

During the first week, special ‘one on one’ training was done with staff who had not attended MINT. As they were already frustrated, they were not very receptive to the changes. However, this special training did contribute to the decline in the number of helpdesk requests. By the end of the first week, the number had dropped to about 75 per day.

Many users had problems conceptualizing that they did not need subfolders. The steering committee felt that if files were named correctly staff could use Windows Explorer to find what was needed.

Many departments had not provided a list of required subfolders, such as the need for vendor names in several secondaries. Had the list been provided prior to cut-over, the subfolders could have been seeded. After cut-over, the Records Management Technician had to create the folders individually and then set permissions on each folder. In Restricted folders setting permissions could take hours or days.

 

Resistance – Futile but Frustrating

There were some staff who continued to resist the implementation believing that their position/job duties allowed them special privileges – such as being able to have more than four levels in a folder tree or being able to save to their C: drive. In these instances the parties and their managers were reminded of the new policies and procedures.

Other staff began using My Documents for storing their files. To discourage this practice, a 500 Mb soft limit was implemented. IT ran regular reports to find the offenders.

  • Over 400 Mb – the offender was given a warning email, cc to RIM;
  • Over 490 Mb – another warning email was sent, cc to manager and RIM;
  • Over 500 Mb – RIM discussed the problem with the offender.
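The escalation ladder is simple enough to express directly. A sketch of the report logic, with each step paraphrased from the list above:

```python
MB = 1024 * 1024

def quota_action(used_bytes):
    """Map a My Documents folder size to the escalation step used in
    the regular IT report (500 Mb soft limit)."""
    used_mb = used_bytes / MB
    if used_mb > 500:
        return "RIM discusses the problem with the offender"
    if used_mb > 490:
        return "warning email, cc manager and RIM"
    if used_mb > 400:
        return "warning email, cc RIM"
    return "no action"
```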

There were some staff that consistently exceeded the 500 Mb limit, despite having been spoken to by their managers. In those cases, the violator’s account was disabled until the user met with RIM and IT to: 1) determine the ongoing problem, and 2) provide extra training to the violator.

 

Demise of H: Drive

As the migration of files to the N: drive and the purge of the H: drive progressed, staff were kept informed of the progress via the intranet. This was to help them see the “big picture” as time passed. Figure 9 is an example of the update provided to staff.

No statistics were kept as to the number of records deleted from H: drive vs. those moved into N: drive. Approximately 98% of the folder growth in N: drive was due to the migration from H: drive and only 2% could be attributed to new folder creation.

As the date for the demise of the H: drive approached, staff tried to circumvent the system and they were pretty creative about it. As the Corporate Officer stated, “Never underestimate the creativity of those who wish to circumvent the system.” Instead of moving required documents into the N: drive, some uploaded them to an FTP site or moved them to their C drive; and some even moved them to the staff photos folder on the intranet. Unfortunately, sometimes documents were lost before IT or RM became aware of the attempt at circumvention.

Some staff also started saving documents to USB keys, CDs and removable hard drives. It was made clear that this was unacceptable. Senior managers were very supportive in making their staff move the contents into the N: drive. However, if they did not know it was happening, it was difficult to stop.

Staffing became an issue as well. Some managers advised that their staff were too busy to deal with the H: drive and there was no budget to bring in extra staff to assist with moving the files.

These were valid issues, but they were the same ones faced by all departments. Proactive departments assigned staff to work on the move over a period of time.

As the deadline approached, managers were given some options to assist with the staffing issue:

  • Assign the task of moving the documents to a staff member and arrange one-on-one attention with the Records Management Technician; or
  • Ask the Records Management Technician to do the work – but it would be outside her normal hours and the department would be responsible for paying for that time.

No manager chose either of these options.

There were people who simply refused to recognize there was an issue. In one department, the manager advised that all the necessary documents had been moved and what remained could be deleted. When the steering committee checked with department staff, they were advised there was a huge volume of documents that still needed to be reviewed.

Originally, the timeline to unplug the H: drive was six months after implementation of the new shared drive. After two extensions, the date the H: drive would be disabled was set at November 30, 2012. A week prior to the deadline, a further extension was requested. As the options for helping to move files had not been used, the CAO decided the deadline would stand.

It was assumed that by the November 30th date the H: drive would be empty – all necessary files would have been migrated to the N: drive and any ROT (redundant, obsolete or transitory content) would have been deleted. This was a completely erroneous assumption. On December 1, 2012, a review of the H: drive showed some important records remaining, including some critical to ongoing litigation. The project team took on the responsibility of moving those documents to the N: drive.

 

Surprises After H: Drive Disappeared

There were some unpleasant surprises after the H: drive disappeared:

  • While the volume of requests to the Records Management Helpdesk diminished dramatically, it did not disappear. Subfolders still needed to be created on a regular basis, especially new case files. That demand continued to put pressure on RIM as it was added to an already long list of regular duties;
  • Because some staff were still not well versed in Windows and some staff still thought they were “special”, the time invested with those individuals remained high;
  • Requests continued to be made to search the H: drive for specific files. Unless it was related to litigation or some other pressing issue, those requests were denied;
  • Some staff continued to resist the N: drive, preferring to use removable mass storage. When discovered, sanctions were taken to rectify the situation. However, short of disabling everyone’s ability to use removable devices, no solution was found to eliminate this practice.

 

Lessons Learned

This was the first corporate-wide RIM and IT project. As such, there were many lessons learned.

Marketing/Change Management

There needed to be more comprehensive project marketing to senior management, managers and staff. All three levels needed to understand the reasons behind the project, the benefits to them, what the project’s goals were and how those goals would be achieved.

In addition, for most people, a uniform file and classification system is an abstract concept. More education on why the structure was being imposed as well as more of a focus on how the structure worked would have eased much of the users’ frustrations.

 

Senior Management Buy-In

The degree of buy-in at the senior management level varied considerably based on the individual’s past experience. If they did not see the value, there was no reason to motivate their staff to do the work. In addition, the makeup of the senior management team changed during the project.

Those departments that had functional and well protected “walled gardens” did not see the need for change. However, for those who were on the outside of those “walled gardens”, the need for change was self-evident.

Withdrawal of support part way through would have sunk the project. Being more aware of those individuals who weren’t supportive of the project would have helped. An absolute necessity was the support of the CAO to back the project when complaints were received about the extra work being required.

 

Time and Budget Allocation

Time allocations needed to be a priority – up-front and visible dedication, not “borrowed” from other existing allocations. Staff with domain expertise needed to be assigned to work on the project and their day jobs transferred elsewhere. Unless a staff member supported the project or was dedicated to it, working on it off the side of their desk automatically made the project a low-priority task.

Staff costs and downtime should have been budgeted. Managers were expected to draw on funding that had already been defended to the SCRD’s Board of Directors on the basis of other work commitments, and the project was outside the scope of each department’s Human Resources plan. A few managers perceived that, in essence, the project ‘stole’ money from their budgets without their active participation, and that perception led to frustration and resentment.

 

Departmental Representatives

The representatives on the working committee needed to have been chosen more carefully. Most managers appointed one of their clerical staff. In some cases that was the best decision, but in other cases, their clerical staff: a) were not sufficiently computer literate, b) did not fully understand the implications for their department, or c) were not aware of all the information their department needed or produced. This led to the creation of folder structures and classifications that did not meet the department’s needs and caused frustration for users. The steering committee should have sought approval from each manager prior to finalizing the folder structure.

 

Lack of Familiarity with Windows

Most staff used Windows tools on a daily basis but were unfamiliar with much of its functionality. Concepts such as shortcuts, hyperlinks, searching and ‘scopes’ were unfamiliar to a large percentage of staff, which was surprising to the steering committee. Therefore, the N: drive training was expanded to include some of these concepts and staff were encouraged to take the Windows course offered through the SCRD’s corporate training program.

 

Consistency with Similar Records

Different departments managed the same records differently:

  • Human Resources treated criminal record checks as part of an employee file, with FOIPP obligations satisfied;
  • Checks on volunteers were managed by individual departments – some recorded the check as being satisfactory, and then destroyed the document. Others retained the original document;
  • Some volunteers were also employees – the same document would exist twice, and would be managed inconsistently based on the ‘hat’ the individual was wearing at the time the check was performed.

These differences still need to be explored and processes put in place so the same type of document is treated consistently throughout the organization.

 

Restricted Folders – Fine Grained Access and Folder Traversal

FOIPP requires that confidential information protected by the act not be disclosed. Every claim, accident and litigation file potentially involves a unique set of staff, and the resulting confidential information must be protected in order to comply with FOIPP. Therefore, permissions on restricted folders needed to be assigned on a case-by-case basis. Certain permissions were implicit – for example, the Purchasing Officer would have access to bids in progress. However, the vast majority of restricted folders have unique and distinct permissions. The N: drive model required an inordinate volume of folders to implement the appropriate segregation of permissions.

Folder traversal, or clicking through subfolders, in restricted folders was disabled to prevent information leakage via folder names. This prevented ‘browsing’ into permitted folders.

Therefore, unless a user had their permissions set at the topmost branch of a folder tree, they were unable to navigate to the restricted folder they needed. The user had to be provided a shortcut which allowed them to ‘jump’ to the restricted folder. Creating the shortcuts rested with the Records Management Technician and was time-consuming.
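The traversal problem can be modelled in a few lines. This is a minimal sketch of my own, not the SCRD’s actual NTFS configuration: browsing requires traverse rights on every ancestor folder, so a shortcut is the only way to ‘jump’ past a blocked level:

```python
# Toy model of folder traversal: a user can browse to a folder only if
# every ancestor along the way permits traversal. Paths are invented.
def can_browse_to(path, traversable):
    """True only if every ancestor of `path` allows folder traversal."""
    parts = path.strip("/").split("/")
    ancestors = ["/".join(parts[:i]) for i in range(1, len(parts))]
    return all(a in traversable for a in ancestors)


# Traversal is disabled inside the restricted branch:
traversable = {"N:", "N:/Legal"}           # the user may list these levels
target = "N:/Legal/Claims/2012-041"        # but cannot click down to this
print(can_browse_to(target, traversable))  # False -> a shortcut is required
```

A shortcut sidesteps the check entirely because it opens the target directly, which is exactly why each one had to be created by hand.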

 

Ongoing Training

Although it was made very clear in the training that ‘Records Belong to the Corporation’, and this position continued to be reiterated with staff, many still struggled with the concept. Some staff didn’t feel the need to follow SCRD naming conventions and some continued to think there was no problem with copying and/or emailing documents for disclosure outside the organization. There was still some confusion as to what constituted a record vs. a transitory document.

Education and training were critical. Every new staff member who would be accessing a computer attended N: drive training (MINT), usually in the first week of their employment, and had read-only access until the training was complete. All managers and supervisors accepted this as part of the on-boarding process. However, there was a need to work with Human Resources to coordinate new hire start dates so training sessions consisted of more than one person at a time.

The MINT sessions for new staff were expanded to include training on protecting privacy, phishing, and managing hard copy records.

There were funds budgeted for corporate-wide records management training to continue to build on the concepts that had been already introduced to all staff. It was important that staff hear the message from a fresh perspective – an outside expert.

 

Locking Out the Locksmith

For all practical purposes, IT staff cannot be kept out of ‘restricted’ file locations. In essence, one cannot lock out the locksmith. This came as quite a surprise to many managers, and it was a point that needed to be driven home.

RIM staff are responsible for managing all structure and performing purges of files, but are not supposed to be aware of the contents (or, theoretically, the existence) of ‘restricted’ files. IT are the administrators of all software, so have access to virtually everything contained within the software.

Although RIM and IT staff at the SCRD are union members, they nonetheless have access to personnel records. As a result, IT and RIM staff have both an ethical and a legal responsibility to protect personal and privileged information. It is worth investing in tools and architecture to demonstrate that the separation of roles has been maintained:

  • Create distinct ‘Directory Maintenance’ users with full access to all folders, but require RIM staff to log in as a separate Directory Maintenance user to perform structural modifications and purges; and
  • Deploy file access auditing software such as Netwrix or Varonis, to track Create/Read/Update/Delete events on restricted folders by any user, including the system administrators.

These proactive steps will help to answer questions about activity by any user or involving any file, not just demonstrate the integrity of the locksmiths.
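Netwrix and Varonis are real commercial products; the toy event log below is only my illustration of the kind of Create/Read/Update/Delete trail such tools capture, with all user names and paths invented:

```python
# Minimal sketch of a file-access audit trail over restricted folders.
# Every actor is logged, including system administrators.
from dataclasses import dataclass


@dataclass
class FileEvent:
    user: str
    action: str  # "create" | "read" | "update" | "delete"
    path: str


def events_for(log, *, user=None, path=None):
    """Answer 'who touched this file?' or 'what did this user touch?'."""
    return [e for e in log
            if (user is None or e.user == user)
            and (path is None or e.path == path)]


log = [FileEvent("it_admin", "read", r"N:\HR\Personnel\File-221"),
       FileEvent("dirmaint", "delete", r"N:\Legal\Closed\2004-003")]
print(events_for(log, user="it_admin"))
```

Because the log records the administrators themselves, it can demonstrate the integrity of the ‘locksmiths’ as well as of ordinary users.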

 

Successes

Despite the challenges, the restructure project had many successes. Although the SCRD did not maintain statistics on duplication and version forks, they were noticeably reduced. For the most part, there was staff buy-in, and in discussions with staff, they found the new structure to be much more intuitive, efficient and easier to navigate.

Switching from assigning specific permissions to named users to assigning users to roles, and granting permissions to those roles, dramatically improved the ease of user account maintenance for IT. The change in how permissions were assigned:

  • Automatically ensured consistent access to information for users who shared the same job function;
  • Created a clearer explanation for why a user might have privileged access; and
  • Reduced the need for both IT and the user to know in advance exactly what they might ultimately need access to.

Also, by having default-deny restricted folders, managers had to follow proper procedures for managing that access – employees could not be quietly moved to a different job as they likely would not have the access they needed. Once a staff member was moved to a new position, the access associated with their old job fell away. As an example, one of three Infrastructure Services Secretaries could post into the Transit Dispatcher position. Their previous access to records such as in-camera minutes and agendas would fall away, and FOIPP-protected Transit customer information would become accessible, all with two clicks.
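The role-based model just described can be sketched as follows. The role names echo the example in the text, but the folder paths and the `accessible` helper are my own illustration, not the SCRD’s actual configuration:

```python
# Access flows user -> role -> folders, so reassigning a user's role
# swaps their access in one step. Paths are invented for illustration.
ROLE_FOLDERS = {
    "Infrastructure Services Secretary": {"N:/Board/InCamera"},
    "Transit Dispatcher": {"N:/Transit/CustomerInfo"},
}
user_role = {"pat": "Infrastructure Services Secretary"}


def accessible(user):
    """Folders the user can reach via their current role (default deny)."""
    return ROLE_FOLDERS.get(user_role.get(user), set())


print(accessible("pat"))                 # in-camera materials
user_role["pat"] = "Transit Dispatcher"  # the "two clicks"
print(accessible("pat"))                 # old access falls away automatically
```

The default-deny behaviour is the key design choice: a user with no role, or an unknown user, gets the empty set rather than inherited leftovers.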

The individual file count has shrunk from 460,000 to 202,000 files, with 80% of the reduction occurring in files that had not been modified in more than seven years.
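As a quick arithmetic check on the figures just quoted:

```python
# Back-of-envelope check of the file count reduction and the share
# attributable to stale files, using the figures quoted above.
before, after = 460_000, 202_000
reduction = before - after             # files removed
stale_share = 0.80                     # 80% untouched for 7+ years
stale_removed = int(reduction * stale_share)
print(reduction, stale_removed)        # 258000 206400
```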

By far, the two biggest successes this project has realized are:

  1. The retention schedule could be applied to electronic records.
  2. There was a uniform, logical, predictable folder structure.

Below is a representation of the folder structure prior to and after restructure. Note that the “spikiness” has been eliminated and the new structure is consistent and predictable – now never more than four nested folders need to be contemplated when navigating, and no records or individual files will ever be encountered until the 3rd or 4th level.

Next Steps

In July 2014, the SCRD put out an RFP for an EDRMS. In discussions with the various vendors who bid on the RFP, they agreed the work done on the shared drive would ease the transition to an EDRMS, as it was an importable structure.

The lessons learned from the restructure project were incorporated into the implementation plan for the EDRMS and the SCRD went live with “Dr. Know” on May 19, 2015.

 

Conclusion

The SCRD’s restructure project was a great success, moving the organization from chaos to order. We will continue to build on this success, evolving our RIM practices, workflows and training as technology changes.

 

From Promise to Fulfillment: The 30-Year History of Electronic Recordkeeping Technology

By Bruce Miller

Here is the inside story of the development of electronic recordkeeping technology1, from its beginnings to its current features and status. This paper is not just a history lesson; it is a frank, factual account of the technology’s successes and failures and of the road travelled. Based on where things stand today, I will do my best to predict the path we should follow in the future. Please keep in mind that throughout this text I often express opinions, which should not be taken as fact. Enjoy!
– Bruce Miller

The history of the development of electronic recordkeeping technology to date can, in my view, be divided into three distinct periods or stages:

 

Electronic recordkeeping software – 1983 to 2002. This period marks the genesis of electronic recordkeeping software, the difficult beginnings of a new technology, and the emergence of a new American software standard that transformed the market forever.

Document management – 2002 to 2012. During this period, the major document management software companies began to incorporate recordkeeping capabilities, transforming not only the technology but the very way solutions come to market.

Microsoft – 2012 to today. Because of Microsoft’s global omnipresence in our business systems, its influence alone is, to say the least, outsized. Microsoft’s SharePoint enterprise content management (ECM) platform began steadily gaining ground on the established ECM vendors. It was during this period that Microsoft finally recognized the need to offer recordkeeping capabilities, as the established ECM players already did. Microsoft made recordkeeping a foundational element of SharePoint, even though competing products already had well-established recordkeeping features. For better or worse, this move brought major changes to recordkeeping software technology and ultimately led to unprecedented innovation that overcame a major obstacle that had held us back in the early days.

The Three Stages of the Market

Electronic Recordkeeping Software

The first stage began in 1983. At the time, a few Canadian federal government agencies, led by a group called the Canadian Workplace Automation Research Centre (CITI), awarded a handful of fairly substantial grants to promising technology companies to develop new workplace automation technologies. Each grant was awarded by a different department. The grants were run somewhat like a competition: each one funded a proposal for a research and development project. The submitted proposals were judged and a winner was chosen, who then received an additional government grant in the hope of encouraging the commercialization of the proposed technology. At the time, I was one year out of school, working for a Montreal company that was building the world’s first portable computer, Bytec’s Hyperion, in Ottawa.

The company I worked for had been awarded a grant by the National Archives of Canada (NAC) and the Department of Communications (DOC). The Department of Communications wanted to stimulate growth in the technology sector; the National Archives wanted new technological capabilities for managing electronic records. I was tasked with developing a proposal for a new technology, so I assembled a team of young software specialists from my company along with key stakeholders from the National Archives and the Department of Communications. The National Archives led the effort to define the requirements, while the Department of Communications acted as a sample federal government consumer of the new technology and helped define them. We called this technology Foremost (Formal Records Management for Office Systems Technology). Our proposal won the competition; however, by the time a formal project could be set up, the company I worked for was collapsing. The future looked bleak, except that we held a terrific idea for a new software product.

My team and I decided to form a new company to develop the technology we had proposed. We each pooled a little money, and our company was born. We secured venture capital funding to keep the business going while we built this new software. The National Archives and the Department of Communications remained hopeful that we could one day deliver the commercial product they needed, so they continued to provide advice and direction on the capabilities we should offer. Development continued until we could ship a first version. In the meantime, another Ottawa software consulting firm had entered the race with a competing product, and we had also heard of a second competitor, this one Australian.

We carried on undeterred, growing our company, and the National Archives of Canada tested and validated our software. We began to attract a great deal of attention in the United States, and interest was growing worldwide, especially among government agencies, which had come to realize that an ever-growing share of their records were electronic and had to be subject to the same mandatory recordkeeping controls as any other records. The director at the National Archives of Canada, John Macdonald (now retired), was a key champion of the project, and through his advocacy for this type of technology he undoubtedly influenced the archives of other countries. It is said that thanks to him, the National Archives of Canada’s work became a determining factor in the later development of the International Council on Archives (ICA) electronic recordkeeping standard.

In a thousand and one ways, we were the typical small software startup. Everything we did was new, never seen before. We did have competitors, but we knew very well that we were ahead of them in innovation. Our product was under constant development, with new versions released every year. We began making some sales to early adopters, but nothing large enough to make the company profitable, much less financially successful. We had generated plenty of interest, including from the US military and from major corporations, but the fact that we were a small Canadian startup scared off more than one potential buyer.

Before we knew it, a decade had passed. Our competitors were getting stronger and adding new capabilities, as were we. Our advantage was venture capital: we had enough funding to build a strong development team focused on creating innovative new features without depending on sales revenue.

Then came the Gulf War. Veterans returning from duty complained of a previously unseen illness that came to be known as “Gulf War syndrome.” What does Gulf War syndrome have to do with electronic recordkeeping? As it turns out, a great deal. Gulf War veterans had launched a class action against the US military seeking compensation. Although I do not recall the outcome of the litigation, I know it cost the US military dearly in taxpayer dollars. The Pentagon later concluded that the failure to recognize the illness stemmed largely from the destruction of soldiers’ electronic medical records on military computers. To prevent the loss of electronic medical records in the future, the Pentagon required all US military agencies to manage their electronic medical records properly. To accelerate the transition to electronic recordkeeping, the Pentagon developed a mandatory standard, US DoD-5015.02-STD, specifying that any electronic recordkeeping software used to store electronic records had to meet the requirements set out in the standard.

The Pentagon went even further, making the bold decision to set up a testing body to certify vendors’ software products against the very strict requirements of the new standard. The new testing organization was based at Fort Huachuca, Arizona, under the Joint Interoperability Test Command (JITC).

Over time, this standard completely changed the market. From then on, the US military, unquestionably the largest consumer of recordkeeping technology in the world, could only buy technology that complied with the new standard. The US National Archives and Records Administration (NARA) adopted the standard and recommended that all US government agencies comply with it, triggering a chain reaction. Before long, a great many US state and local agencies had adopted the standard, and many corporations even followed suit. Their requests for proposals (RFPs) for recordkeeping software made 5015.02 compliance mandatory. The trend later spread to Canada and Europe. By my estimate, nearly half of the RFPs of the era made 5015.02 compliance mandatory.

There was just one problem.

No software could be tested, because no software compliant with the new standard had ever been written. My team and I made substantial contributions to the team that developed the standard; however, the JITC took great care to remain vendor-neutral. I remember many discussions about what was possible versus what was practical. The resulting standard contained an impressive number of unprecedented capabilities. I believed our new software met most of the core requirements, but considerable work remained to meet the standard. The race was on to achieve certification, and we were far from the only company in it: every software company in the field now had to obtain this certification or risk losing significant market share.

I knew our company was betting everything; we had to move forward. We sharply increased our venture capital funding to give our team more skills and resources, and the date of our software’s first certification test at Fort Huachuca was set.

Once again, there was a problem.

How would the JITC test software it had never seen before? Test procedures had to be written, and to write them the JITC needed to know what it was testing. Yet it had never seen software of this kind. The JITC was new to recordkeeping, let alone to this new category of software. To make a long story short, I visited the Fort Huachuca labs, near the Mexican border, many times over a period of several months. I grew accustomed to the lab staff, and we got to know each other well. Fort Huachuca, an active military base equipped with a drone airfield, was a typical bustling US military base, and I got along well with the many other contractors and civilians working there on day passes.

Once the test procedures were finalized, the big day arrived for the testing of our software, the very first product the JITC team would test. That day I showed up to join the lab team as usual. Security was unusually tight. Since my last visit to the lab, the United States had launched the second Gulf War. I did not know it, but every military base raises its security level when the country is at war. I was stopped at the gate by military police and asked to show identification. Before that day, my Canadian passport was the only ID I had ever needed, but things had changed. On test day, I was escorted out into the Arizona desert, Canadian passport in hand. My laptop, containing all the test data and procedures, was confiscated and the data erased forever. I will never really be able to recount exactly how we recovered from that misadventure; suffice it to say the story involves some friendly people in the US military, the scorching desert, and a great deal of Kentucky Fried Chicken and Coca-Cola over a period of several days.

And so the very first 5015.02-certified software came to be. Software companies lined up to book their certification test dates with the JITC, and the waiting list was two years long. The world of electronic recordkeeping software was never the same. 5015.02 certification became every product’s primary goal. The standard forced every vendor to build very specific capabilities into its software or risk losing market share, and it ultimately shaped the architecture and overall feature set of every certified product. Our local Ottawa competitor later obtained certification, as did our Australian competitor.

Once the American standard was well established, Australia followed with its own, the Victorian Electronic Records Strategy (VERS). Only several years later did the Australians develop a software testing regime to support it. Other standards began to appear:

  • ICA Module 2, Principles and Functional Requirements for Records in Electronic Environments. This standard later became ISO 16175-2:2011;
  • MoReq (MoReq1, MoReq2), originally intended as a European standard.

It seemed to me that all these standards sought, in one way or another, to surpass or at least complement the American standard. At the time, I considered US DoD 5015.02 absolutely essential for making sales in the US market. MoReq, in my view, seemed to ask the moon of software, such as the ability to “preserve the readability of electronic documents forever.” Good luck! I had a hard time taking that standard seriously, and I never met a single buyer who required compliance with it. I had great admiration for VERS, given that it is based on 5015.02, but keep in mind that only Australian buyers cared about it, while I, like most of my competitors, was more oriented toward the US market.

What about Canada? The federal government, through the Treasury Board of Canada, eventually chose ICA Module 2 as the standard for Government of Canada software. This standard is quite different from 5015.02 in that it is written from the archivist’s point of view, so it focuses mostly on capabilities supporting records preservation. The ICA standard was simply not as centred on the active life cycle of records, and most of its specifications sat downstream of the active life cycle. Moreover, the contributions to the standard came mostly from Commonwealth countries, and it showed: the standard includes the old block numeric system as well as security levels (Confidential, Secret, Protected, Top Secret…). ICA Module 2 requires a staggering 275 capabilities, compared to 168 for 5015.02, and some of them, much as with MoReq1, simply represented an ideal and were impossible to implement with the technology of the day.

I am not arguing for one standard over another; every standard has its uses. I do worry, however, whenever I see the word "should" in a software standard. You cannot test a software feature against a requirement that says "should." To my mind, a software standard is only truly a standard when it can be tested against real-world technology. Everything else is a statement of an ideal, and no software vendor should declare a product compliant with a standard on that basis, though some do.

How were the electronic recordkeeping software companies doing? Of the three companies I have mentioned so far, the Australian firm was faring best: it appeared to be selling more software, albeit mostly in Pacific Rim countries, and it was growing more strongly and more quickly than the others. It had concentrated on its home Australian market and on the European market, and things were going well; it grew to a staff of about 35 people. The Australian firm earned its US DoD 5015.02 certification late, but it eventually established a beachhead in the United States by hiring a highly respected former US Air Force pilot, who had become manager of the military records program, to represent it there. The Australian firm mainly offered solutions built around physical records (paper file folders, boxes); it always seemed to me to treat electronic recordkeeping as an afterthought. Our local Ottawa competitor likewise sold mostly paper recordkeeping solutions, with electronic recordkeeping as a sideline. That company earned its 5015.02 certification fairly quickly and won a lowest-bidder Canadian government tender for a system that would become known as RDIMS, the Records, Document and Information Management System. Despite our disappointment, we remained convinced that our future lay in the American market, not the Canadian one. Unlike our competitors at the time, our solutions were focused entirely on electronic records, with physical records as a secondary capability.

So why didn't we focus on paper records? Because our mission was to deliver solutions for electronic records, not paper ones. The world was awash in perfectly adequate paper document management solutions. We stayed entirely true to our mission.

The first 5015.02 certification galvanized the entire electronic recordkeeping software industry. Prospective buyers demanded compliance with the standard; without it, you could not sell software into most of the American market. And for a while, only one small Canadian software company was compliant. Suddenly, the big document management software companies were paying attention...

 

Document Management

Over the same period that the electronic recordkeeping software market was developing, from 1983 to 2002, a parallel market was emerging: document management. Document management products offered an organized repository for every electronic document created, plus powerful search capabilities so users could find the documents they needed. It made no sense, after all, for every user in a company to store documents solely on individual computers. Demand for document management kept growing. This was not a small market like electronic recordkeeping; it was an absolutely enormous one. By 2002, this kind of software had become commonplace in companies around the world. Then, as now, it seemed that virtually every organization with more than a few hundred users needed a document management system.

The document management sector was growing, in market maturity and in capability, and by 2002 it had rebranded itself as "content management" or, more precisely, enterprise content management (ECM). IBM had its own platform, Content Manager. OpenText had entered the market with its LiveLink ECM platform and, through several acquisitions, including that of the Canadian firm Hummingbird, advanced its Content Server ECM platform. FileNet had its well-known ECM product, FileNet P8, focused chiefly on imaging. But the pioneer was Documentum, a California company. Thanks to its large-scale ECM solutions for major pharmaceutical companies around the world, Documentum utterly dominated the pharmaceutical sector. If it was big, it was very often Documentum. IBM was no laggard either: it had installations worldwide, including one very well-known customer, the US Social Security Administration, with more than 200,000 users. Hewlett-Packard (HP) later entered the ECM platform market through its acquisition of Autonomy. It is worth stressing that these "new" ECM platforms were really just the continuing evolution of document management; only the catchier names and the ever-growing capabilities were different. The recordkeeping software sector, by contrast, was a tiny specialized market niche served by a handful of relatively small software vendors. The two markets did not yet overlap to any significant degree. ECM was ECM, and records management was records management, plain and simple.

Mergers were constant in the ECM platform market. Documentum was acquired by EMC, whose content business has since been acquired by OpenText. IBM, for its part, acquired FileNet.

These companies paid little attention to the electronic recordkeeping market; after all, they had no reason to. But once the first 5015.02 certifications arrived, compliance with US DoD 5015.02 appeared more and more often as a mandatory requirement in the RFPs they were answering. Several prospective US government buyers and customers told them flatly: "We can no longer buy your products unless they comply with 5015.02." In Canada, and no doubt in other countries, 5015.02 compliance appeared among the requirements of many ECM technology RFPs. There was, however, a problem...

The ECM vendors knew nothing about 5015.02, so they were caught completely off guard. They had two choices: build or buy. They could design and develop these peculiar new functions for their products and eventually earn certification, or they could buy existing technology and fold it into their own products. Remember that by this point we had already spent more than a decade designing, developing, and refining this technology, focused exclusively on this specialized niche. Earning 5015.02 certification took considerable knowledge, a large team of highly skilled developers, and a great deal of time, effort, and money. None of that could be done easily or quickly, even by the mighty IBM.

The ECM platform vendors (IBM, FileNet, Documentum, and OpenText) all needed this certification, and as soon as possible; the lack of it was hurting their sales quotas. There were three potential acquisition targets: my company in Ottawa, our local Ottawa competitor, and our Australian competitor. Four companies needed the certification, and only three small specialized firms held the coveted technology.

Here is what happened. I left my own company and founded another to develop the second-generation technology we believed was essential to succeed. My former company was acquired by Documentum. My new company was, in turn, acquired by IBM. My local Ottawa competitor was acquired by OpenText. As for the Australian firm, it decided to go it alone. FileNet, left out of the acquisition game, had no choice but to build recordkeeping functionality itself, and it set to work immediately. Years later, our Australian competitor was finally acquired by Hewlett-Packard.

With the notable exception of the Australian firm I have already mentioned, the fledgling electronic recordkeeping software companies vanished practically overnight, swallowed up in the race among ECM platform vendors to comply with DoD 5015.02 before their competitors did. The few small specialized vendors had been absorbed; only the Australian firm remained. The ECM platform vendors were now steering the ship. To this day, I believe that was a good thing for the market.

I came to see that organizations facing recordkeeping requirements had no choice but to buy software that met those needs (usually, 5015.02 compliance), which did not necessarily mean they used the functions they had acquired. Recordkeeping was now just one feature among many in modern ECM technology. Anyone who wanted electronic recordkeeping functionality first had to acquire an ECM platform. Recordkeeping had become a mandatory checkbox in most ECM platform RFPs. ECM software sales kept growing, and everyone was happy.

I cannot speak to the other acquisitions, but IBM did a remarkable job of absorbing my company. The integration, which mattered enormously to IBM, was carried out remarkably well. And yet I was anything but happy with the situation. After a few years, I had noticed that despite all the electronic recordkeeping solutions sold and delivered, few buying organizations had actually deployed them. This was not just IBM; it was practically the entire market. ECM software was being deployed with recordkeeping functionality, but there was absolutely no sign that these new ECM platform features were actually being used to manage electronic records.

I saw many organizations that claimed to be managing their electronic records. Dig deeper into their projects and several nagging issues surfaced, again and again. Many (far too many) organizations had no idea how to deploy electronic recordkeeping functionality. Some had included recordkeeping in their ECM platform order because they were obliged to, but never deployed it (shelfware). Many tried to manage electronic records, failed, and fell back to managing paper records only, leaving their electronic records unmanaged. I watched one large, very well-known American company make three attempts, spending millions of dollars deploying recordkeeping technology with two different ECM software vendors, before finally giving up.

In my view, the success of a recordkeeping deployment comes down to two simple tests. First, electronic documents must be properly managed as records on a routine, daily basis by all users. Second, the organization's records manager must carry out disposition of electronic records in accordance with the electronic records retention schedule. I have heard stories of supposedly successful electronic records management over and over. Yet every time I dug into those projects looking for my two success criteria, I never found them.
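The second criterion, disposition driven by the retention schedule, can be sketched in a few lines. This is purely illustrative, not any vendor's API: the category names, retention periods, and record fields below are all invented.

```python
from datetime import date

# Hypothetical retention schedule: category -> years retained after cutoff.
RETENTION_SCHEDULE = {
    "FIN-100 Accounts Payable": 7,
    "HR-200 Recruitment Files": 2,
}

def due_for_disposition(records, today):
    """Return the ids of records whose retention period, counted from
    their cutoff date, has elapsed under the schedule."""
    due = []
    for rec in records:
        years = RETENTION_SCHEDULE.get(rec["category"])
        if years is None:
            continue  # unclassified records are never auto-disposed
        cutoff = rec["cutoff"]
        expiry = cutoff.replace(year=cutoff.year + years)
        if expiry <= today:
            due.append(rec["id"])
    return due

records = [
    {"id": "inv-001", "category": "FIN-100 Accounts Payable",
     "cutoff": date(2015, 3, 31)},
    {"id": "cv-042", "category": "HR-200 Recruitment Files",
     "cutoff": date(2024, 1, 1)},
]
print(due_for_disposition(records, today=date(2024, 6, 1)))  # ['inv-001']
```

The point of the sketch is the dependency: disposition only works if every record carries a classification that maps into the schedule, which is exactly what the failed deployments lacked.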

ECM software clearly met the recordkeeping requirements, at least those set out in 5015.02. Many ECM vendors went well beyond 5015.02, offering features such as content search, computer forensics, and physical file management. Most even offered email management in some form, a crucial element of electronic recordkeeping given that a large share of electronic records are emails. ECM platforms offered powerful capabilities. World-class companies staffed with highly competent people were delivering effective ECM solutions.

So we are entitled to ask why the recordkeeping component of these ECM solutions failed even when the underlying ECM project was a resounding success. In fact, there were a number of clear and obvious reasons. Over the years, across many projects, many ECM technologies, and many kinds of organizations, I began to understand the root causes (the project factors) that kept electronic recordkeeping from taking root and succeeding. Sometimes a single factor was enough to doom recordkeeping; sometimes it was a combination of several. Either way, the outcome was the same unhappy one: electronic recordkeeping did not happen. Here, then, are the four project factors I observed and still observe today.

 

The low priority given to recordkeeping. Moving to a company-wide ECM platform is a major undertaking, costly and risky for any organization. It is a significant shift in the company's IT environment and extremely disruptive for users. ECM projects are very expensive, and reputations ride on them. Their success is paramount, and there is little tolerance for failure. In some cases, IT staff feel that their reputations, even their jobs, are on the line if the project fails. That obligation to succeed does not extend to the recordkeeping component of the project, which is merely incidental. Nobody would dream of cancelling an ECM project because the recordkeeping component did not deploy well. Nobody loses a reputation or a job over it. If the recordkeeping component of an ECM project fails, the project carries on, without recordkeeping. Measured against the ECM project as a whole, recordkeeping's relative importance is low. Many organizations cannot afford to let the ECM project fail, which in practice makes the recordkeeping component optional. In my experience, then and now, recordkeeping fails in far too many ECM platform deployments, with the organization's records and information management professionals falling back on managing paper records while the advanced electronic recordkeeping features sit unused.

ECM platform vendors know full well that they can sell their products as long as the products meet the requirements. If the recordkeeping deployment goes badly, they are unlikely to face the wrath of a buyer who, in too many cases, does not understand the technology well enough to know how to manage it.

In my view, this situation is the unintended consequence of recordkeeping being just one function of the ECM platform. At the outset, everyone agreed that recordkeeping was an absolutely essential function.

Then came ECM, and then recordkeeping failed. So should we scrap ECM projects and start over? Of course not! ECM is a safe bet, always safe. Recordkeeping is not.

 

Records and information management (RIM) professionals are under-equipped. Some RIM professionals misread my position here and think I believe them incapable of managing this technology; nothing could be further from the truth. Today's RIM professionals are more technology-literate, better educated, and better equipped than ever to manage this kind of technology. Yet many of us remain vulnerable to the pace of ECM technology's advance. The corporate records manager must develop an intimate understanding of the new ECM software's recordkeeping capabilities. That is a great deal to learn and master, usually on a very short timeline. They must become entirely comfortable with the ECM technology and work through questions like: How does it work? How is metadata defined? How do I control metadata values? How do I manage documents as records? What must I require of users? These professionals must also load the retention schedule into the ECM software. That alone can feel like mission impossible: most retention schedules are not structured appropriately for modern ECM systems, requiring at minimum some data manipulation and, in some cases, a complete restructuring.

RIM professionals must first master the ECM technology, then master its electronic recordkeeping capabilities, then overhaul and load the retention schedule, and finally exert strong influence over how the ECM platform is configured and deployed so that recordkeeping works properly. And all of this must usually be accomplished within an already-full 40-hour work week, on top of their day-to-day recordkeeping responsibilities. That is a tall order even for the best of us, to put it mildly.

I would also argue that ECM vendors invest little work and money in the recordkeeping component relative to the platform as a whole. I even know of one ECM vendor that came within a hair of abandoning recordkeeping entirely because the function was, in its view, "not important enough." In my experience, the documentation, training, professional services, and post-sales support for recordkeeping functions often lag behind those for the rest of the ECM platform. I have no doubt this simply reflects the priority given to recordkeeping functions relative to the platform itself, but the fact remains that most vendors have considerable room for improvement here. In the end, this is yet another hurdle RIM professionals must clear in order to master the technology.

The fault lies not with RIM professionals but with the industry as a whole, vendors and buyers included. The education gap is widening, not narrowing.

 

Failed ECM platform deployments. Not every ECM platform deployment succeeds. I have never seen two assessments of ECM deployment success that define success the same way, let alone any consistency in results or metrics. It is therefore impossible to know exactly what proportion of ECM platform deployments succeed. Whatever that number is, it is certainly nowhere near 100%. Put another way, the ECM platform is the ship that carries electronic recordkeeping. If the ship is unsound and end users reject it, there is nothing recordkeeping can do to turn things around. The electronic recordkeeping features of every ECM solution require broad adoption by all end users, and they must capture not only the electronic records themselves (including email) but also the appropriate metadata on which recordkeeping depends. Without that, it is practically impossible to manage documents as records by applying the proper controls, classifying them against the retention schedule, and disposing of them at the end of their useful life.

 

The gulf between RIM professionals and IT professionals. Because recordkeeping is one function within the larger ECM platform ecosystem, it cannot operate outside of it. The records themselves are documents stored in the ECM platform and entirely under its control. The platform must be configured to determine which metadata applies to documents. The ECM platform defines and enforces security permissions. It dictates where documents are stored by defining locations such as folders, libraries, or whatever other nomenclature the particular ECM platform applies to those locations.

Consequently, for the recordkeeping functions of ECM platforms to work, RIM professionals must collaborate very closely with the IT team on configuring the ECM platform. Too often, however, the IT team is uncomfortable with RIM professionals telling them how to configure it. Worse still, few RIM professionals are comfortable configuring complex ECM systems; they must learn an entirely new technology and skill set from scratch. The few RIM professionals who tackled this head-on were often stonewalled by large IT departments, which pressed ahead and configured the ECM platform to their own plan without regard for the RIM professionals. In too many cases, IT managed, planned, configured, and deployed the ECM platform according to its own vision and approach, with little or no regard for the recordkeeping functionality. RIM professionals often lacked the mandate, the political clout, or the know-how to exert enough influence over the ECM platform deployment to ensure a successful electronic recordkeeping rollout. In my experience, the larger the organization, the wider this gulf tends to be. IT deploys the ECM platform, and the RIM professionals manage the paper records, trying in vain to influence the direction of ECM.

By 2012, many ECM platform vendors were growing uneasy about the recordkeeping component of their projects. They saw recordkeeping as complicated, a lot of trouble, in short. Surely, they reasoned, there was a simpler solution that would do the job.

By 2012, most ECM products shipped with the ability to apply "policies" to documents. A policy is a set of behaviours and characteristics that can be applied to a document automatically. A policy can be attached to a location (a folder, for example) so that every document in that location is automatically governed by it. These policies specify how long documents are retained, when they are destroyed, and any other criteria that must be met before a document is deleted. Sometimes these policies honoured the retention schedule, but they did not have to; it was entirely possible to create any policy at all with no reference to the retention schedule.
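A minimal sketch makes the mechanism, and its weakness, concrete. Everything here is invented for illustration (the FolderPolicy class, the folder paths, the field names); it is not any ECM vendor's actual API. Note that nothing ties the retention period to a retention schedule category: the policy hangs off a folder, full stop.

```python
from datetime import date, timedelta

class FolderPolicy:
    """A location-based retention policy: every document in the folder
    inherits it, regardless of what the retention schedule says."""
    def __init__(self, folder, keep_days):
        self.folder = folder
        self.keep_days = keep_days  # delete after this many days

    def deletable(self, doc, today):
        return (doc["folder"] == self.folder
                and today - doc["modified"] >= timedelta(days=self.keep_days))

policy = FolderPolicy("/projects/closed", keep_days=365)
docs = [
    {"name": "bid.docx", "folder": "/projects/closed",
     "modified": date(2022, 5, 1)},
    {"name": "contract.pdf", "folder": "/legal",
     "modified": date(2019, 1, 1)},  # outside the folder: untouched
]
today = date(2024, 1, 1)
print([d["name"] for d in docs if policy.deletable(d, today)])  # ['bid.docx']
```

The convenience is obvious, and so is the risk: a record with a seven-year legal retention requirement that happens to land in that folder would be eligible for deletion after one year.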

This new capability became quite popular in ECM products because it was seen as a way for customers to delete documents without taking on the difficulty and expense of recordkeeping, which most of the time was going nowhere anyway. Many organizations therefore adopted these simple document-deletion policies based on criteria imposed by IT or supplied by the business units, without regard to the retention schedule. Better than nothing, they reasoned. I thought the opposite: this was a worrying trend, yet more confirmation that the industry as a whole was making no progress on deploying electronic recordkeeping.

To this day, many ECM vendors offer formal recordkeeping capabilities alongside these generic "policy capabilities." More than once I have heard vendors describe these policies as "recordkeeping light," which appalls me. Having devoted my entire career to developing electronic recordkeeping technology, I was deeply troubled to find that virtually no electronic recordkeeping deployment had achieved real success or met my simple criteria. So I left IBM to found a vendor-independent consultancy whose sole mission is to help buyers deploy their electronic recordkeeping projects successfully.

It was against this backdrop that Microsoft entered the ECM arena in 2001 with its SharePoint software, which few established ECM platform vendors took seriously. Anyone in the software business knows, however, that you underestimate Microsoft at your peril (ask Michael Cowpland). By 2012, Microsoft had slowly but surely captured an ever-larger share of the ECM market, working its way up from the low end. Established vendors such as EMC, IBM, and OpenText now had to watch out: SharePoint was growing and becoming a serious contender in the market. Like all of its ECM competitors, Microsoft soon had to tackle "records management."

 

Microsoft

Microsoft SharePoint's slow, inexorable market penetration reached the point where the software became a force to be reckoned with. Here are a few figures I gleaned here and there on the Web:

  • SharePoint is a two-billion-dollar business for Microsoft.
  • According to Microsoft, SharePoint gains 20,000 new users every day.
  • 80% of Fortune 500 companies use SharePoint.
  • According to Microsoft, enterprise market penetration is 66%.

The figures above are not current (I could not find more recent ones), but there is no doubt that SharePoint has a strong presence in the modern enterprise. As a result, more and more SharePoint sites are being used, as you might guess, to store documents that must be managed as records.

I set the end of the previous stage at 2012, and for good reason, which I will come to later. For now, I would like to go back to 2006, when Microsoft's adventure in recordkeeping began.

In 2006, Microsoft began deploying SharePoint in ever-larger U.S. corporations and government agencies. By then, the ECM platform had become a serious player and was taking market share from IBM, FileNet, OpenText and the rest, starting at the low end of the market. Some of Microsoft's most valued customers issued an ultimatum: build a product that complies with recordkeeping standards, or we will drop SharePoint for a competitor.

With the release of the 2007 edition of SharePoint, Microsoft announced, with great fanfare, the addition of recordkeeping capabilities. Overnight, the Web was flooded with articles from Microsoft and SharePoint experts of every stripe explaining to the world how to manage records with SharePoint 2007. It was now possible to create policies that automatically deleted documents according to a retention schedule. I set out to analyze this new capability in depth and evaluate it, for myself and for my clients. I was dismayed by what I found.

I read everything I could find on the Web, and I even traveled to Microsoft's headquarters in Redmond, Washington, to discuss the capabilities with the people behind them. I met virtually every member of the five-person team assigned to SharePoint records management (one of whom was a Montrealer and a big Montreal Canadiens fan). They were terrific people, but I can tell you they had absolutely no records management experience. I even had the privilege of speaking with Microsoft's chief SharePoint architect.

My verdict? They missed the mark. There was simply no way in SharePoint to manage a proper retention schedule, or even to import one. Proper management of master records was impossible, and the disposition process was nonexistent. SharePoint simply discarded documents as their due dates arrived.

Microsoft rolled out the heavy artillery on the recordkeeping front. Its legions of SharePoint partners and experts repackaged and republished Microsoft's core recordkeeping message. Even today, I can find thousands of Web pages explaining in detail how to manage records with SharePoint. I explained to Microsoft where it had gone wrong. In response, I was politely told why I was obviously the one who was wrong. Microsoft sent its own records managers out on the road to describe how Microsoft manages its records with SharePoint, and how the whole world could now do the same.

Perhaps it was just a feeling, but I felt like a pariah. I was preaching in the wilderness, directly contradicting the almighty Microsoft. As I discovered, Microsoft had a great many allies: its worldwide network of partners and resellers, all of whom were profiting from the sale of SharePoint-related services. Naturally, when I told organizations that SharePoint did not manage records, the partners and resellers told them the exact opposite, with Microsoft's unwavering backing. Who backed me? No one, of course. Records and information management professionals around the world strongly doubted Microsoft's claims, but no one seemed able to explain exactly what was wrong with SharePoint.

In 2009, Microsoft said it had taken note of SharePoint's problems, and it announced, once again with great fanfare, a raft of improvements to the recordkeeping features: In-place records! New content organizer features! A new records center! Staged retention policies!

The announcement hit most of the key points. By then, a degree of skepticism had set in across the market about Microsoft's ability to manage records. Once again, I went to Microsoft's offices to see for myself what had changed. My conclusion: not enough, and certainly nothing that overcame the (three) original fatal flaws. I was tired of answering the same questions over and over: so what exactly is wrong with SharePoint? Few people took me seriously.

Microsoft went on offering recordkeeping functionality in SharePoint, and let's just say I was not a particular fan of Microsoft's worldwide partner network.

I had had enough, and I had to do something: my reputation and credibility were beginning to suffer. So I set out to write a detailed report laying out the hard facts. Knowing that every last detail would be challenged, I researched it carefully and validated the information with Microsoft. The report clearly stated SharePoint's flaws and explained in detail how to configure the product to do things properly. I asked Microsoft to review the report for accuracy, which it did, and Microsoft authorized me to use a statement confirming that it had reviewed the report for factual accuracy. I distributed the report to interested parties, and ARMA International (https://members.arma.org/eweb/home.aspx?site=ARMASTORE) published it as a book. In no time, the report was circulating worldwide. What happened after publication? Nothing, which was, in my view, the best possible outcome: the report was never challenged. Over the years, and not only because of my book, more and more people came to realize that Microsoft might have gotten it wrong. Relations between Microsoft and me even warmed somewhat: I was invited to join the cross-Canada SharePoint promotional tour, and I accepted their offer to explain how to manage records with SharePoint. The adventure lasted only one session. Microsoft fired me for my supposedly too-negative attitude toward recordkeeping. I had become a pariah once again…

Now back to 2012, the year the third stage begins. Some enlightened software vendors in the recordkeeping business saw a market niche emerging from Microsoft's recordkeeping troubles. A brand-new market segment was born: recordkeeping add-ins for Microsoft SharePoint. As of this report, only four vendors offer such solutions: two in the United States, one in Australia, and one in Canada. Only one of the four held certification under the 5015.02 standard, and two others had committed to obtaining it.

SharePoint buyers could finally get true recordkeeping. The four add-in vendors each took a radically different approach to implementing recordkeeping in SharePoint, yet all of them succeeded, after a few false starts for some. What would happen if Microsoft one day managed to build recordkeeping natively into SharePoint? It would likely destroy the embryonic market for recordkeeping add-ins. I doubt that will ever happen. Microsoft maintains that as long as a healthy market for recordkeeping add-ins exists, it has no interest in diving back into the recordkeeping market. The company prefers to focus on its platform and encourage an ecosystem of third-party products and services that use the platform to deliver solutions to customers. In short, as long as people are buying SharePoint, Microsoft is happy. If Microsoft is happy and the recordkeeping add-in vendors are happy, let's hope SharePoint buyers are too! From where I sit, so far, so good. Every couple of years a new recordkeeping add-in vendor enters the market, and I eagerly await each new arrival.

In this market segment of just four relatively small vendors, nothing is done the way it is in the rest of the ECM market (the non-Microsoft vendors). The pace of innovation is staggering. Recordkeeping add-in vendors do not have to worry about the architecture of an ECM platform, so they can pour all their energy into records management, leaving the hard ECM work to Microsoft. Nor do they have to worry about a repository, metadata creation, or even search; Microsoft already does all of that for them. They are free to find innovative new ways of applying recordkeeping to documents. Microsoft's ECM platform competitors, for their part, devote relatively few resources to recordkeeping features. The recordkeeping add-in vendors, by contrast, have built development, marketing, and support teams dedicated exclusively to recordkeeping.

There is another fundamental difference. Most traditional ECM products are what I call location-based, in the sense that a document's actions and characteristics are determined by where it lives (folder, library, and so on). Location is what matters, and users must constantly think about where files are stored. SharePoint turns that logic on its head with a location-independent approach, in which location does not matter. Over the years, one of the most telling objections I heard from end users was their dislike of being told where to store their documents. Until now, most (though not all) ECM solutions dictated where users had to store documents for recordkeeping to work properly.

And here is where the magic really happened: the four vendors finally delivered capabilities that made true recordkeeping automation possible. Rules-based recordkeeping is a software capability that lets records and information management professionals fully automate the following:

  • Determining which documents are records;
  • Deciding when to declare documents as records;
  • Classifying documents against the retention schedule (correctly this time!);
  • Deciding when to move records into a long-term archive.
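The four automated decisions above can be pictured as a small rules engine that evaluates each document's metadata against the retention schedule, with no end-user involvement. The sketch below is purely illustrative; the class names, rule fields, and the two-entry sample schedule are invented for the example and are not taken from any vendor's product:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class Rule:
    """One retention-schedule entry, matched on document metadata."""
    category: str        # classification code in the retention schedule
    match_type: str      # document type this rule applies to
    retain_years: int    # how long to retain after declaration

@dataclass
class Document:
    doc_type: str
    created: date
    is_record: bool = False
    category: Optional[str] = None
    dispose_on: Optional[date] = None

def apply_rules(doc: Document, rules: List[Rule]) -> Document:
    """Automate the four decisions: is it a record, when to declare it,
    how to classify it, and when it becomes due for disposition."""
    for rule in rules:
        if doc.doc_type == rule.match_type:
            doc.is_record = True                  # declared automatically
            doc.category = rule.category          # classified per schedule
            doc.dispose_on = doc.created + timedelta(days=365 * rule.retain_years)
            break                                 # first matching rule wins
    return doc

# Hypothetical schedule: invoices kept 7 years, contracts 10.
schedule = [Rule("FIN-100", "invoice", 7), Rule("LEG-200", "contract", 10)]
doc = apply_rules(Document("invoice", date(2015, 3, 1)), schedule)
print(doc.is_record, doc.category)   # True FIN-100
```

The point of the sketch is that every decision flows from the document's metadata and the schedule, never from an end user's judgment.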

This is the breakthrough I had been waiting thirty years for! For three decades we depended on users to identify which documents are records and to classify them against a retention schedule they neither care about nor understand. That dependence on users is what has held this technology back from the very beginning. In my view, all of that is now history. To be fair to the non-SharePoint vendors, some traditional ECM products have built in a degree of automation, but I have never seen anything approaching the level of automation in the SharePoint market segment.

You might conclude that I am promoting SharePoint solutions over other ECM products, but you would be wrong, once again. Traditional ECM platform vendors will always hold a strong position in the market. Microsoft is the first to admit that its product cannot meet everyone's needs. I have clients with ECM requirements that SharePoint cannot meet, and I doubt it ever will. One of my clients uses Documentum to manage millions of aircraft maintenance files and drawings; I very much doubt they will move to SharePoint in my lifetime! Meanwhile, SharePoint use keeps growing inside organizations that run traditional, non-Microsoft ECM platforms. Some will inevitably move to SharePoint. Others will ignore it and carry on just fine. Still others will try to make the two systems work together, with SharePoint for document creation and the traditional ECM platform as the formal records repository. That is technically possible, but quite difficult to set up and manage.

Every non-Microsoft ECM platform vendor has the means to offer good recordkeeping capabilities in its solution line; it simply takes more time and effort to get there. Absolutely nothing prevents these ECM platform vendors from building the same rules-based recordkeeping automation into their products, and I strongly encourage them to do so.

 

Where Do We Stand?

ECM systems that offer electronic recordkeeping capabilities are now better known in the market as EDRMS (Electronic Documents and Records Management Systems), so that is the term I will use from here on.

Today, EDRMS technology can be sorted into three distinct market "groups."

Traditional ECM platforms – These are the platforms of the major ECM vendors I have referred to (IBM, OpenText, etc.). To get EDRMS capabilities, you first invest in a high-end ECM platform and then use the platform's recordkeeping component. As of this report, these vendors have not innovated in rules-based recordkeeping as much as the SharePoint add-in vendors have, but that could change quickly.

SharePoint – You must buy a third-party add-in to complete SharePoint. That is a drawback in itself; any IT professional will tell you that integrating separate products is never the best technology option. On the other hand, it allows the recordkeeping add-in technology to be more focused and innovative. As a bonus, rules-based recordkeeping automates most recordkeeping functions, so you no longer depend on end users to meet your recordkeeping performance targets.

Standalone systems – A small number of software companies specializing in recordkeeping (I can name only three, maybe four) target records and information management professionals first and have broadened their products to include ECM functions. Their products are records-centric, meaning records sit at the heart of products aimed at the recordkeeping market. I include HP in this group, with its HP RM offering. Although HP RM's technology configuration is similar to that of the other two in this group, the company behind it is quite different: HP is a global enterprise, while the other two are small, specialized niche players.

 

All three groups have their place in the market, and they will keep delivering solutions that work. I have almost never seen an organization choose an ECM technology solely on the basis of recordkeeping requirements. More often, the IT department picks either SharePoint or a traditional ECM technology and sticks with it, and the recordkeeping capabilities follow from that technology choice. An organization that chooses OpenText, for example, will necessarily use OpenText's recordkeeping functions. Choosing SharePoint, on the other hand, means choosing among four recordkeeping add-ins.

I am not fond of the standalone solutions I have seen so far. I have watched people struggle mightily to make end-user adoption a success, especially in small and mid-sized deployments (under 1,000 users). These products will probably suit organizations with a strongly recordkeeping-oriented culture best. I believe more work will be needed to make end-user adoption succeed than in the other groups.

Remember the four problems I mentioned earlier that have sunk EDRMS projects until now? They are all still with us. So what has changed? The technology, particularly the SharePoint-related technology. With rules-based recordkeeping, the odds of success are much better. The low priority given to records management relative to the ECM platform is no longer as hard an obstacle to overcome as it used to be. Nor is training and equipping records and information management professionals. The other two factors remain what they always were: successfully deploying an ECM platform is hard enough in itself, and if records and information management professionals and IT do not work together, recordkeeping is still doomed to fail.

 

Email

By my estimate, email makes up 30% to 80% of all of an organization's digital records, and all of it must be properly controlled and managed as records. Put another way, there are three to five times more emails than documents that meet the criteria of a business record, and those emails should and must be managed as records. No organization can claim to be managing its electronic records if email is not subject to a records control process.

Technologically, email remains the Achilles' heel of any modern EDRMS project. The problem is that most organizations use the Outlook/Exchange platform for email, and that messaging platform has absolutely no recordkeeping capability. It is an island holding gigantic volumes of stored data, utterly disconnected from the ECM platform. Even Microsoft's own ECM platform does not integrate email. Across the ECM platform market, every vendor has to write specific integrations between its products and Exchange so that users can easily bring their email into the ECM platform to be managed as records. With SharePoint, yet another third-party product is needed to integrate email. Indeed, any EDRMS solution built on SharePoint requires acquiring three different technologies: SharePoint, the recordkeeping add-in, and an email integration product.

Even when email is tightly integrated with the ECM platform so that users can easily file messages into it, deciding which emails to file remains the end user's call. For an end user, that means deciding which emails matter to the organization, then following the process for submitting them to the ECM platform, which includes filling in the mandatory metadata describing what each email is about. All of that adds to the end user's workload, which takes us right back to the dark years when we depended on users' judgment and effort. We all know how that turned out!

A few innovative new technology solutions exist (I can name only one with certainty, maybe two) that use software built on artificial intelligence or similar technology to read the email in an inbox and determine which messages are records. Software of this kind ranks every email by the probability that it is a record and dictates how it should be classified. Does the technology work? More or less. In some cases, with certain kinds of email (predictable, well-described messages), it works very well. In others, the results are genuinely poor. Overall, the technology is promising, but we are still a long way from everyday ease of use. I also note that the costs and overhead required to support these ambitious capabilities are exorbitant, so in my view they suit only large, well-funded projects with considerable resources.
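In spirit, this kind of auto-classification assigns each message a score and ranks the inbox by it. The toy sketch below illustrates only the ranking idea, using invented keyword weights; real products use far more sophisticated machine-learning models than this:

```python
# Toy email scorer: estimates how likely a message is a business record.
# The keywords and weights are invented for illustration only.
WEIGHTS = {"contract": 0.4, "invoice": 0.4, "approval": 0.3,
           "lunch": -0.5, "newsletter": -0.4}

def record_score(subject: str, body: str) -> float:
    """Sum the weights of keywords found in the message; clamp to [0, 1]."""
    text = (subject + " " + body).lower()
    score = 0.5 + sum(w for kw, w in WEIGHTS.items() if kw in text)
    return max(0.0, min(1.0, score))

inbox = [
    ("Contract approval needed", "Please approve the attached contract."),
    ("Team lunch Friday", "Who's in for lunch?"),
]
# Rank the inbox, most record-like first.
ranked = sorted(inbox, key=lambda m: record_score(*m), reverse=True)
print([round(record_score(s, b), 2) for s, b in inbox])   # [1.0, 0.0]
```

Predictable, well-described messages (the first one) score cleanly; ambiguous ones are exactly where such systems produce the poor results described above.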

 

The Future of Electronic Recordkeeping

When I look at the future of electronic recordkeeping, I could easily try to predict any number of technology trends; it is a fun exercise, after all. I could predict that one day nearly all of us will be in the cloud. I can certainly predict that ECM platform vendors will catch up with the SharePoint add-in vendors and offer rules-based recordkeeping. But those predictions will not help us.

Besides, that is not what I see in my crystal ball. What I see in the future is the thing that preoccupies me most these days: education.

Education is where the emphasis must go. Much more of it. The best technology in the world is useless if we do not know how to use it to reach our goals. We have plenty of technology, and it has reached the point where we can largely automate recordkeeping processes. In the past, technology held us back; that is no longer the case. What holds us back now is ourselves. We must do a better job of helping records and information management professionals acquire the knowledge and skills to understand enterprise content management deeply enough to influence how the platform is configured for deployment. We must understand modern EDRMS software better, so we can learn to automate through rules-based recordkeeping. And finally, we must better understand EDRMS project management methods and techniques, such as establishing key performance measures to keep projects running well.

I recently suggested to a large global records management firm that its EDRMS projects were not a great success and that a large-scale training program was needed to turn things around. Its reaction? "Our clients are doing just fine. You don't know what you're talking about."

We now have the technology we need to succeed. Think of a modern airliner waiting on the runway. Without a pilot with the training and skill to fly it, that marvel of technology is nothing but an expensive, useless machine. In the EDRMS world today, far too many airplanes are still sitting on the ground. So what are we waiting for? Let's take off.

 

1 Translator's note: In the French text of this document, « enregistrement » was used to translate "record," « gestion des enregistrements » to translate "records management," and « tenue des enregistrements » to translate "recordkeeping."
2 At the time this document was written, SGDDI had become known as GCDocs.

2016 Edition

Meet the Authors

Uta Fox is the Records and Information Manager in the Records and Information Management Section of the Calgary Police Service. She holds a Master’s Degree from the University of Calgary and is a Certified Records Manager. She co-authored several ARMA International publications and sits on its Content Editorial Board. She is ARMA Canada’s Director of Canadian Content.

Christine Ardern, CRM, IGP, is a Past President and Fellow of ARMA International and an Emmett Leahy Award winner. She has been involved in information management and archives planning, development and implementation internationally. Christine taught at the University of Toronto's iSchool Institute and presented at ARMA seminars and workshops.

John Bolton, (CRM-Retired), has had over 26 years of RIM experience. His career included special librarianship, project management and 20 years with the Provincial Government of British Columbia working in the IM/IT field. He has spoken at numerous conferences and has published articles on records management, privacy, and standards.

Alexandra (Sandie) Bradley has been a records and information manager for 32 years. Through her chapter, regional and international roles within ARMA, she has been a mentor and teacher, researcher, writer and advocate for our profession. She is a Certified Records Manager and was made a Fellow of ARMA International (Number 47) in 2012.

Jim Coulson is Managing Director, Morae Legal, with over forty years’ experience, including founding two global RIM consulting firms and building Huron Consulting’s global RIM practice, as Managing Director. A Certified Records Manager, Jim is a Fellow of ARMA International and received the Emmett Leahy Award, considered the highest global award in the profession.

In practice for over 20 years, Stuart Rennie's Vancouver-based boutique law practice specializes in records management, privacy and freedom of information, law reform, public policy and information governance law. In 2015, Stuart was awarded the ARMA Canada Region Member of the Year Award and Distinguished Member Award.

Canadian RIM, an ARMA Canada Publication Spring 2016 Volume I, Issue I

 

Introduction

Welcome to the inaugural issue of Canadian RIM*, an ARMA Canada Publication!

ARMA Canada is thrilled to provide Canadian records and information management (RIM) and information governance (IG) practitioners with a resource focusing on Canadian issues in RIM and IG. Searching for recordkeeping information on Canadian laws, regulations, governance, standards, history, privacy, practices and ethics has just become much easier with access to this new publication.

Canadian RIM, an ARMA Canada Publication takes a hybrid approach to Canadian RIM and IG, embracing the practical, the theoretical, and challenges to the RIM-IG status quo. Articles will, for the most part, be by Canadian (though not exclusively Canadian) practitioners, experts, and scholars addressing Canadian RIM topics, not only through pragmatic approaches such as how to do an aspect of RIM-IG, lessons learned and success stories, but also through scholarly, research-based RIM-IG articles and opinions that may query the status quo.

And while we are most eager to promote Canadian content, and it is our primary focus, the publication will not be exclusively Canadian, as we want it to provide a forum for RIM-IG topics with wide-ranging impact or consequences for our industries. We welcome non-Canadian as well as Canadian contributors to submit articles that will further the RIM-IG fields. For further information on contributing articles (all articles will be peer-reviewed) to Canadian RIM, an ARMA Canada Publication, visit the Canadian RIM webpage on the ARMA Canada website.

This first issue of Canadian RIM, an ARMA Canada Publication features seven articles.

Have you considered the origin of IG? Christine Ardern discusses RIM and IG within the Canadian context in her article “From Records Management to Information Governance: A Look Back at the Evolution,” which traces the evolution of RIM to IG and the Canadian role within that evolution. This article is also available in French.

One of the focuses of John Bolton’s article, “A Content Analysis of Information Impact: Professionalism or Not – A Critical Twenty-Five Year Review,” is a question that RIM practitioners have been asking on and off for many years: is records management a profession? He delves more deeply into that question by reviewing articles from ARMA International’s Information Management (previously titled Information Management Journal and, before that, Records Management Quarterly).

As records managers and practitioners know, implementing a records management program can be overwhelming in most organizations. Alexandra (Sandie) Bradley’s article, “Recordkeeping Issues in First Nations Governments,” considers the multitude of challenges that First Nations must address to establish such a program.

In “Dispelling Myths about Records Retention in Canada,” Stuart Rennie discusses some of the prevailing myths surrounding records retention and then provides the legal background and basis that refute those myths, including the impact of provincial legislation on some record series.

Another of Canadian RIM, an ARMA Canada Publication’s goals is showcasing the history of Canadian records and information management. Three articles address this topic. Jim Coulson’s article, “Some Personal Reflections on the History of RIM in Canada” highlights a number of impressive Canadian accomplishments in this industry.

One of Uta Fox’s articles, “The History of Records Management in Canada, 1867-1967,” has previously appeared on the ARMA Canada website (accessible in the Resource section), but with Canadian RIM, an ARMA Canada Publication, ARMA Canada is now consolidating original RIM resources into this publication.

Uta’s other contribution “The Failure of the Red Deer Industrial School,” her master’s thesis, was written in 1993. Due diligence of past recordkeepers ensured that this institution’s correspondence, reports and directives were preserved which made this thesis possible. It is linked to the Truth and Reconciliation Commission of Canada’s website.

We also want to hear from you and invite you to share your thoughts, suggestions and opinions. As we have noted, this is your publication. You have experiences that need to be shared and this publication is your conduit. For further information please contact: armacanadacancondirector@gmail.com.

Uta Fox, CRM
ARMA Canada
Director of Canadian Content

*The name “Canadian RIM” is currently a working title. ARMA Canada will be holding a competition to find another name that best reflects the spirit of this publication.

 

DISCLAIMER

The contents of material published on the ARMA Canada website are for general information purposes only and are not intended to provide legal advice or opinion of any kind. The contents of this publication should not be relied upon. The contents of this publication should not be seen as a substitute for obtaining competent legal counsel or advice or other professional advice. If legal advice or counsel or other professional advice is required, the services of a competent professional person should be sought.

While ARMA Canada has made reasonable efforts to ensure that the contents of this publication are accurate, ARMA Canada does not warrant or guarantee the accuracy, currency or completeness of the contents of this publication. Opinions of authors of material published on the ARMA Canada website are not an endorsement by ARMA Canada or ARMA International of those opinions and do not necessarily reflect the opinion or policy of ARMA Canada or ARMA International.

ARMA Canada expressly disclaims all representations, warranties, conditions and endorsements. In no event shall ARMA Canada, its directors, agents, consultants or employees be liable for any loss, damages or costs whatsoever, including (without limiting the generality of the foregoing) any direct, indirect, punitive, special, exemplary or consequential damages arising from, or in connection to, any use of any of the contents of this publication.

Material published on the ARMA Canada website may contain links to other websites. These links are to websites not under the control of ARMA Canada and are provided solely for the convenience of users. ARMA Canada assumes no responsibility for, and does not guarantee, the accuracy or legality of material published on these other websites. ARMA Canada does not endorse these other websites or the material published there.

From Records Management to Information Governance: A Look Back at the Evolution

 

1.0 Introduction

Christine Ardern, The Information Management Specialists, Feb. 2016

When I started to write this article I had a particular objective in mind, shaped by my own experiences in records and information management from 1975 to 2015. And, like our world, that objective changed. Considering that I have been involved in archives and records management for 40 years, I realized that my career, across a variety of organizations, had exposed me to all aspects of information governance, albeit under different names. And I have a strong belief that understanding where we have come from puts some perspective into where we are going.

This is not a comprehensive, all-inclusive study. Rather, it is an overview, from a personal perspective, of how we have evolved from records management to information governance, describing some Canadian initiatives that have both influenced and been shaped by that evolution. It’s fair to say that I was fortunate to work in organizations that varied in their records issues, and I participated in everything from strategy development to records centre operations to building an archival facility and serving as internal RIM consultant to one of Canada’s largest financial institutions. Needless to say, these roles exposed me to many different learning opportunities and facets of what now falls under the umbrella of information governance.

I have attended a number of ARMA International Conferences over the years (the first one in Toronto in 1975!!) and heard speakers discuss many different topics. One of the presentations that struck me the most was at the 2005 ARMA International Conference, in Chicago, when Daniel Burrus, the keynote speaker, talked about the future being visible in his presentation, Future View, A Look Ahead. One of his key themes is articulated in this statement:

You need to visit a place that I call the visible future™. It is a place you can clearly see, but you have to take the time to look. Most of us never take the time to look. The visible future is the fully predictable future. The more you look and ponder the future that you know is coming, the more you can capitalize on that future.

In 1995, at an ARMA International conference in New Zealand, one of the speakers demonstrated what would happen to telephones in the future, based on research being done in the field. It looked totally impossible… seeing the people we were talking to? And where are we today? It was the visible future! Daniel Burrus’ point was that if we pay attention to the innovation and research being done in places such as MIT or CERN, these organizations give us a look at what is coming down the road.

If we look back at how we got to where we are today, it is not difficult to see that the evolution of technology created the big data environment in which we now live. It is hard to imagine a world without email and the World Wide Web. But 50 years ago, that was the reality. How did technology change our workplace and lead to the era of information governance? Did it happen suddenly, or was it an evolutionary process? Look at these key dates1: innovation that just kept going, and it continues today.

  • 1930s – the mainframe computer is created.
  • 1969 – ARPANET, the predecessor of the internet, is first created.
  • 1976 – the Apple I home computer is created.
  • 1979 – the first cell phone network is created in Japan.
  • 1981 – the first IBM PC is sold to the public.
  • 1989 – Tim Berners-Lee and Robert Cailliau build the prototype of the World Wide Web at CERN, the European Organization for Nuclear Research.
  • 1994 – the American government releases control of the internet, and the World Wide Web as we know it is born – just over 20 years ago. And how did that impact the workplace?

What changed and why now? The early 1990s saw the introduction of personal computers in the office, bringing with them the idea that records management would no longer be required. Computers could do it all. At that point, the web was in its infancy and used mainly to transmit email. Organizations were still sending letters and reports through inter-office mail and via Canada Post. Before the internet, information generated through computers was managed internally and protected by IT departments. It was some time before information was shared and transmitted electronically between departments and business units in different locations and countries. The internet changed everything.

 

And Now We Have Big Data

And big data didn’t just happen! People have been writing about the volumes of data being created and stored since the mid-1940s2.

Think about your organization: how is information received, used and transmitted, and where is it stored? It’s the same in your personal life. How many different devices does YOUR home have? How many different ways are there today to create and share information with each other? The following list gives examples of why data volumes are increasing daily. And if information supports your business, then it needs to be managed.

  • The internet and wireless transmissions allow us to create and share information through text messages, blogs, Twitter, Instagram, Facebook and a myriad of other social media connections.
  • Smart devices communicate with each other. Hydro and gas companies have smart meters; OnStar can provide updates on your car’s maintenance status from data it has captured on the car’s computer system. GPS locators on your phone, iPad, etc. can tell where you are. All that data is being collected by the organizations that are tracking it.
  • Large organizations such as financial institutions and insurance companies capture huge volumes of transactional data daily as customers do their banking online, through bank machines or any other technology that financial institutions have provided to interface with clients.
  • In addition to all the structured data in systems, we are still faced with the unstructured data generated by employees in their day-to-day business in network directories, emails and system applications that support content management, etc.

In 2007, EMC and IDC3 published their first study on the Digital Universe in an attempt to project the growth of data creation as a result of the World Wide Web. Its 2014 report projects that:

By 2020 the digital universe – the data we create and copy annually – will reach 44 zettabytes, or 44 trillion gigabytes4.

And the old adage – storage is cheap, keep everything forever – has now come back to bite IT departments that find themselves with terabytes and more of data that is old and even inaccessible. There are costs to managing electronic data over time, as software changes and is no longer supported. Disposal, as part of an overall business activity, is a necessity for reducing costs and risks and improving efficiency in any organization. And in today’s litigious environment, if you cannot find the right information when the judge asks for it, the result can be out-of-court settlements in the millions of dollars.

 

How Old Is Information Governance?

Information has been around for a long time, whether it was referred to as non-records, transitory records, or publications and databases. For years, organizations relied on paper “records” (and still do) as the major source of evidence of transactions and business decisions. With the advent of technology and the proliferation of trans-border data flows, issues such as privacy, information/data ownership and security all started to take on new meaning with the realisation that electronic information assets are much easier to access.

Information, like records, is stored in many different places as organizations move to serving clients through social media and the cloud. It is created every day and still needs to be managed, used, stored and disposed of in accordance with business and regulatory requirements. As more and more records, data and information are stored electronically, the practices that were applied to paper are now required in the electronic world and are being built upon to embrace this new workplace.

Many of us are members of North American organizations, and the articles we read are often presented from that perspective. How long has “information governance” been on the radar? At the 2015 Information Governance Initiative (IGI) conference in Hartford, Connecticut, the question was posed to the audience: 5 years? 10 years? More than 10 years? The majority of the audience chose the first answer: 5 years. From the perspective of what we mean by information governance, the actual answer is “longer than 15 years,” driven initially by the need to ensure the privacy of personal information.

Privacy, security, records retention and disposition are needed regardless of the medium on which records, information and data are created and stored. Because of technology and the global transmission and exchange of information, organizations now recognize the need to bring together what were previously siloed departments and groups into cross-functional teams to address the issues from an organization-wide perspective. Policies and procedures for each component cannot be drafted in isolation based on a particular area of interest, such as IT, privacy, business, legal and risk, or information management. The requirements overlap and have to be addressed as part of the whole. An information governance framework is now where the pieces come together. Components of IG have been in place for a long time, albeit in separate departments. So how is it defined today?

 

Information Governance: Some Perspectives

As information governance has evolved, the definition has varied, depending on the particular source and perspective. There are, however, consistent themes across all the definitions.

The National Health Service in the UK has created an Information Governance Toolkit5 to ensure that the information collected as part of the overall operations of the health system in the United Kingdom is managed in accordance with the Caldicott principles (discussed later in this paper). It states that:

  •  Information governance is to do with the way organisations ‘process’ or handle information. It covers personal information, i.e. that relating to patients/service users and employees, and corporate information, e.g. financial and accounting records.
  • Information governance provides a way for employees to deal consistently with the many different rules about how information is handled…

 

Gartner, Inc

Founded in 1979, Gartner is well known for its research and reports, which are widely referenced in the IT and information management communities. In 2007, Gartner identified information governance as a “top of mind” issue for its clients, describing it as:

  • The specification of decision rights and an accountability framework to ensure appropriate behaviour in the valuation, creation, storage, use, archiving6 and deletion of information. It includes the processes, roles, standards and metrics to ensure the effective and efficient use of information in enabling an organization to achieve its goals.

 

The Sedona Conference7

In the Sedona Conference Journal, Volume 15, Fall, 2014, an article titled The Sedona Conference Commentary on Information Governance proposed the following definition:

  • (Information Governance) means an organization’s coordinated, inter-disciplinary approach to satisfying information compliance requirements and managing information risks while optimizing information value. As such, information governance encompasses and reconciles the various legal and compliance requirements and risks addressed by different information-focused disciplines, such as records and information management (“RIM”), data privacy, information security, and e-discovery. Understanding the objectives of these disciplines allows functional overlap to be leveraged (if synergistic); coordinated (if operating in parallel); or reconciled (if in conflict)

 

The Information Governance Initiative (IGI)

The Information Governance Initiative (IGI), established in 2013, is a cross-disciplinary consortium and think tank dedicated to advancing the adoption of information governance practices and technologies through research, publishing, advocacy and peer-to-peer networking. In its 2015-2016 Annual Report, the IGI presented its definition of, and set of components for, information governance, based on input from its community:

Information governance is the activities and technologies that organizations employ to maximize the value of their information while minimizing associated risks and costs.

 

The Components of Information Governance8

In its Annual Report 2015-2016, the IGI presented its findings from a research survey sent out to its IG community members asking them to highlight which of twenty-two activities they felt fit into the domain of information governance. The following graphic reflects the results of the survey:

A number of these components, consolidated, would fall under the Information Governance Reference Model, as defined in a 2011 whitepaper – How the Information Governance Reference Model (IGRM) Complements ARMA International’s Generally Accepted Recordkeeping Principles (GARP®)9:

  • The IGRM supports ARMA International’s GARP® Principles by identifying the cross-functional groups of key information governance stakeholders and by depicting their intersecting objectives for the organization.

The Information Governance Reference Model depicts the areas that have an interest in the organization’s information assets and that require collaboration in today’s environment, with RIM as one element in the cross-functional approach.

The reality is that, depending on where you practice records management, what your responsibilities are and the type of organization in which you work, your exposure to the components of information governance will have varied. Someone who has worked in a school board environment in Canada for the past number of years could have had responsibility for privacy, records management and compliance. In a financial institution, you may have been the records manager who had to ensure that security, retention and privacy requirements were built into a system design process.

 

A Look in the Mirror

The Evolution from Records Management to Information Management: 1960 – 2000

To understand the transition from records management to information governance, it is helpful to look at how each has been and is being defined. In 1969, Bill Benedon10, considered one of records management’s luminaries, published Records Management11, devoted, as the title implies, to records management program design and implementation. Benedon writes:

Records Management is a term well chosen for covering information processing activities now and in the future. New innovations, such as magnetic tapes and other forms of miniaturized documentation, while changing the complexion of the record, still present the same problems of retention, storage, forms design, reporting needs, protection and of course, the oldest of all, filing requirements, now referred to in a much more fanciful manner – retrieval.

He goes on to add “it will be quite some time before accounting and auditing people are willing to say that once you have your information in machinable form, you can throw away the source document”.

Close to 50 years later, we have seen the source document slowly change from paper to digital… and we need to start disposing of the digital format!

1981 saw the publication of the second edition of Information and Records Management by Maedke, Robek and Brown, a book that became the basis of many records and information management course curricula. It defined records management as:

  • The application of systematic and scientific control to the recorded information that is required in the operation of an organization’s business. Such control is exercised over the creation, distribution, utilization, retention, storage, retrieval, protection, preservation and final disposition of all types of records within an organization.

In 2001, the first international records management standard, ISO 15489, issued through the International Organization for Standardization (ISO), defined records management as:

  • The field of management responsible for the efficient and systematic control of the creation, receipt, maintenance, use and disposition of records, including processes for capturing and maintaining evidence of and information about business activities and transactions in the form of records.

The common element in all the definitions was the need for a systematic approach to managing records and recorded information. So what has changed? Did records disappear or were they absorbed into information management as technology began to emerge in the workplace?

 

Records and Information Co-exist

Information has always been created on many different media and in a variety of formats, as have records. In the paper world, we had records and non-records or transitory records (information!). Convenience copies, drafts of reports, and reference and research reports were all created in the day-to-day business of the organization, had different retention periods and were disposed of in accordance with agreed-upon policies. However, they were not records because they did not provide evidence of decisions made by the organization.

Before personal computers, the internet and social media, databases stored data and generated printed output from that data. Financial reports, payroll summaries, inventory listings, etc. were considered records and subject to corporate retention periods. From an IT perspective, retention focused on the period of time for which data was kept on servers or tapes, rather than on the information content.

As more and more information was created at the desktop by knowledge workers12, and data was stored in systems rather than printed to paper, legislation changed to allow electronic transactions in place of paper. The internet affected everyone: less time was spent printing and filing, and more information was left in shared files on networks, in legacy systems, on backup tapes, etc. And so began the CIO’s nightmare. Not only was more information being created electronically, but what about all those old systems that contained obsolete data, and all those old tapes in the data centre that no one had indexed? How could data be disposed of?

 

Addressing Data in Systems: The Archivists Started It!

Machine Readable Archives Division: Public Archives of Canada.

I was first introduced to the concept of electronic records as an archivist at the Toronto Harbour Commission. During a visit to the Public Archives of Canada in the mid-1970s to learn about records centre operations, records management policy and procedure development, and various aspects of archival operations, I met with staff in the Machine Readable Archives Division. Archivists were dealing with the challenges of managing and ensuring the long-term preservation of data in systems many years before records managers and IT departments were, and Canada was among the leaders in dealing with machine readable records. In his article FOCUS: The Machine Readable Archives Division of the Public Archives of Canada (Archivaria, 1978, 176-180), Harold Naugler reported that:

“As part of the Federal Government’s EDP records management program…the Office of Records Management Services of the Public Archives of Canada undertook in 1976-77 an inventory of machine readable records in some sixty-seven government departments.”

Naugler further stated that:

“There are vast amounts of information in machine readable form covering such varied aspects of national life as employment, crime, disease, immigration, emigration, climate, geology, food production and consumption, housing, transportation, communication, and the cost of living.”

The methodology used in the inventory process was subsequently developed into a set of guidelines created by the Machine Readable Archives (MRA) Division13 which later became part of the government’s Guide on EDP Administration.

In 1984, recognizing digital preservation as a global issue, Harold Naugler authored The Archival Appraisal of Machine-readable Records: a RAMP study with guidelines published under UNESCO’s General Information Programme and UNISIST. At that time it was viewed as one of the leading publications addressing electronic data management.

Groups such as the Association of Canadian Archivists, the Society of American Archivists and the International Council of Archives were appraising records and data in systems for long term preservation. Did managing data in systems need someone to understand what the systems were and what data was created, apply retention requirements and ensure that the formats and storage media were appropriately preserved? Yes. Did we call it information governance? No. It was a piece of the puzzle.

 

RM becomes IM becomes IG: Revolution or Evolution?

In 1989, when I joined CIBC as the Manager of Archives and Records Management, I was part of the Corporate Governance Group, whose components reflected the current view of information governance. The Corporate Governance Group consisted of individuals involved in records management and archives, legal, compliance, privacy, audit and corporate security. The Records Management group collaborated with other members of the department, depending on the specific projects; the issues were cross-functional and too big for one group to address. The group worked with IT to incorporate records retention into the application development lifecycle, and members of the RIM team participated in software selection. The RIM group acted as internal consultants and participated in decisions about electronic records and transactional data, hardware and software selection.

Many articles and publications that reference the beginnings of the concept of information governance point to the National Health Service (NHS) in Great Britain as one of the leaders in establishing an information governance framework. In 1997, as a result of concerns over privacy and the impact of technology on patient data, Dame Fiona Caldicott chaired a panel to look at patient-identifiable data and how patient information was handled across the NHS. As a result of that review and subsequent work, seven principles, known as the Caldicott Principles14, were created. Those principles continue to be used today to assess whether or not patient information is being managed and protected properly. Information governance has now become a mainstay of the NHS and is supported by toolkits, policies, procedures and guidelines.

The level of involvement RIM professionals have had with some or all of the components of information governance has been driven by the type of organization in which they work. Someone working in a small municipality as the City Clerk may find themselves responsible for RIM, Privacy and Legal. In a large financial institution, the RIM professional may be part of a cross-functional team where individuals are responsible for the IG components. As a RIM professional in a law firm, eDiscovery might be the driving factor behind IG for clients. The regulatory environment under which the organization operates may create a stronger need for a focus on different IG elements but in the final analysis a successful IG program is the sum of its parts.

 

Information Management in the Government of Canada

Information management and records management have coexisted for many years in the Federal Government of Canada. The Government of Canada’s Information Management program grew out of the 1989 issuance of the Management of Government Information Holdings Policy, an initiative of the Treasury Board Secretariat, which “consolidated existing policies on records management, information collection and public opinion research, micrographics, EDP records management and forms management”.

In 199515, the Treasury Board Secretariat issued guidelines on managing government information, including a model showing the lifecycle of information holdings and stating that information management programs were responsible for:

  1. Planning
  2. Collection, creation, receipt
  3. Organization, transmission, use and retrieval
  4. Storage, protection and retention
  5. Disposition through transfer or destruction

At that time, a 55-page user manual supported the overall implementation of information management in Federal Government departments. As the workplace has changed, so too has the policy title – 2003 saw the creation of the Policy on the Management of Government Information which was replaced in 2007 by the Policy on Information Management.

To explain the relationship between records management and information management in the Government of Canada would be a study unto itself, requiring a detailed look at the policy development roles and responsibilities of the Public Archives of Canada (later to become the National Archives and now Library and Archives Canada) and the Treasury Board Secretariat. What was previously known as records management within the Federal Government has now become Recordkeeping Practices under the Directive on Recordkeeping, first issued in 2009:

Recordkeeping is a resource management function through which information resources of business value are created, acquired, captured, managed in departmental repositories and used as a strategic asset to support effective decision making and facilitate ongoing operations and the delivery of programs and services.

What has changed? Information resources are created across every organization. Some need to be kept to meet legal and compliance requirements and others can be disposed of. In the case of the Federal Government’s recordkeeping policy, records have now become information resources of business value. Can we project what will be next?

 

The Government of Alberta Information Management Program

Many records management programs were created as part of a facilities management function, since paper was transferred to a records centre at the end of its active phase in the office. Typically, the perception was that managing records was a physical activity in a storage warehouse. As technology was introduced into the workplace, the perception of records management continued to focus on paper! The fact that computers were now creating and storing data and information, some of which were records, was difficult to accept, as records management continued to be viewed as paper-based, supported through records centre operations. Organizations began to transition from records management to information management as computers became more and more prevalent.

How was the information to be managed? In the paper world, it had been defined as non-records or transitory records. In the electronic world, it was all captured and stored together as a set of ones and zeroes. It became necessary to change the concept of records to address the changing needs of the workplace. It was the content that mattered, not the medium on which it was stored, and the legal environment began to address the electronic workplace through legislation such as electronic transactions acts.

In Canada, one of the leading government information management programs was that of the Government of Alberta whose Information and Technology Strategy was adopted by the Deputy Ministers’ committee in 2001. The main reason for the transition from records to information was that “records” were perceived as paper only and the workplace was moving towards a much broader context of electronically stored information, some of which was identified as a record.

Recognizing the need to manage the Government’s information assets, including its records, the then Director of Information Management, Sue Kessler, working closely with Dr. Mark Vale16, created an Information Management Framework17 which, from today’s perspective, covers the majority of the pieces as defined in an information governance model.

The Government of Alberta’s Information Management guidance and resources were, and continue to be, used not only by the Government of Alberta’s departments but by both the private sector and other governments, to transition from records management to information management.

As records management has evolved to information management to information governance, the scope has expanded even though the fundamental activities required for program management have not changed. Regardless of what we are managing, we still require:

  • Strategies and frameworks
  • Policies and procedures
  • Standards and guidelines
  • People
  • Processes
  • Technology

The medium through which records and information are transmitted is changing and the volume is increasing daily, but the concept of the lifecycle or continuum still exists and the need for organizations to be accountable and compliant has not gone away.

 

Data Protection and Privacy

Early Drivers towards Privacy Controls: OECD Model Privacy Guidelines18

The need to provide oversight to data in systems has been of concern for some time in international organizations such as the Organisation for Economic Co-operation and Development (OECD). In 1980, the OECD created its Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, to which Canada formally adhered in 1984, stating that:

The development of automatic data processing, which enables vast quantities of data to be transmitted within seconds across national frontiers, and indeed across continents, has made it necessary to consider privacy protection in relation to personal data.

The OECD was concerned that the amount of personal data being captured through large data processing systems would put privacy at risk and encouraged the member countries to create their own national data protection and privacy legislation, which Canada did in 1983.

With technology, not only was privacy a concern, but data could now be easily transmitted across international borders, raising concerns about country-specific legislation in such areas as data ownership. Local laws around data ownership could limit the ability to share information within an organization doing business globally. Industries such as banking and insurance, given the nature of their international businesses, were particularly concerned about restrictions on trans-border data flow created by country-specific laws.

The OECD model guidelines were intended to support harmonization of privacy laws while supporting trans-border data flows as technology changed the way data was transmitted and shared. In celebrating the Guidelines’ 30th anniversary in 2011, the OECD stated that:

The stand-alone technologies of the 1970s have become a ubiquitous, integrated global infrastructure. Occasional global data flows have given way to a “continuous, multipoint global flow,” highlighting the need for privacy enforcement authorities around the world to work together to develop globally effective approaches to protecting privacy. Advances in analytics and the monetisation of our digital footprints raise challenging questions about the concept of personal information and the appropriate scope for the application of privacy protections.

Canada, driven by the need to comply with the OECD guidelines as a result of being a member, created Privacy laws at the national and provincial levels, together with a strong privacy infrastructure in government departments. Recognizing the need for access to information, alongside data protection and privacy, the Federal government enacted two separate laws in 1983: The Access to Information Act and The Privacy Act19. As more and more information was captured in databases and systems across government departments, policy frameworks, as discussed earlier in the paper, incorporated the necessary security and controls to ensure that personal information was appropriately collected and managed.

How did the legislation impact non-government organizations? Why would non-government records managers have to be aware of the Access to Information Act? Any organization required to send records to the Federal Government, for example, the Salvation Army20, had records held in Canadian Government departments. As Access to Information legislation was enacted, records managers in non-government organizations had to be aware of what records were shared with the Government of Canada, and of the measures in place to control access to and manage the privacy of third-party records, as part of their overall RM activities.

 

Being Proactive: Building Privacy into Technology

Privacy by Design21

As more and more information was collected and stored in systems, the need to ensure that privacy was protected to meet the requirements of the various federal and provincial privacy acts resulted in the development of privacy guidance and controls.

Ontario has played a leading role in the privacy domain, both in Canada and internationally, as a result of the work of Dr. Ann Cavoukian, who served as Information and Privacy Commissioner (IPC) of Ontario from 1997 to 2014. Dr. Cavoukian believed (and still believes strongly) that rather than wait until privacy became a problem in technology, it was necessary to build privacy protection methods into overall technology development and design. She outlined her position in her 1995 paper, Privacy-enhancing Technologies – A Path to Anonymity22, written in conjunction with the Netherlands Data Protection Authority. Committed to ensuring that privacy is an integral part of everyday business practice, Dr. Cavoukian created Privacy by Design, which incorporates the following seven principles to be applied to privacy and technology:

  • Proactive not reactive; preventive not remedial
  • Privacy as the default setting
  • Privacy embedded into design
  • Full functionality: positive-sum, not zero-sum
  • End-to-end security: full lifecycle protection
  • Visibility and transparency: keep it open
  • Respect for user privacy: keep it user-centric

The Privacy by Design framework was adopted as an international framework for privacy and data protection in 2010.

Dr. Cavoukian’s work on Privacy by Design continues today at Ryerson University in Toronto, where she is the Executive Director of the Privacy and Big Data Institute. Her work has led to the creation of a certification program, in partnership with Deloitte Canada, through which companies and organizations that have embedded privacy into their day-to-day operations and comply with the seven Privacy by Design principles can be certified.

 

Information and Data Security

For many of us who started in records management before the impact of technology, security referred to locked offices, filing cabinets and work spaces. Security classifications were determined by the importance of information to the organization whether on paper or in electronic form. The most common designations included confidential, restricted, internal use and public. Organizations might have additional designations depending on the information, such as top secret and secret in government organizations.

Before the implementation of systems security controls, access and retrieval rights to online information and data were controlled through designated individuals, to ensure that appropriate persons were given access to the right documents, depending on what permissions they had.

Since the majority of the information resources were managed within the organization, the physical security issues were more easily controlled. As organizations have moved to automated systems, the Cloud and the internet, the need for security controls has increased and the ways to implement the security controls have changed.

 

Today’s Real Threats

We all hear about security breaches. As records and information managers, how involved are we in knowing where the risks to our information assets exist?

The issue of information security is not new. Recognizing it as a key element of access and privacy, in 1986 the Treasury Board Secretariat introduced the Government Security Policy intended to: “ensure that all classified and designated information or assets of the federal government are safeguarded in an appropriate manner”.

As the use of technology has expanded into the Internet of Things, BYOD, social media and the Cloud, information security has become another element of the information governance framework. International standards, such as ISO 27002: Information Technology — Security Techniques — Code of Practice for Information Security Management23 (just one of a number of standards that support information management and information security, listed in Appendix “A”), are being created to identify security control measures and practices, including procedures or mechanisms, that may:

  • protect against a threat,
  • reduce a vulnerability,
  • limit the effect of an unwanted incident,
  • detect unwanted incidents, and
  • facilitate recovery.

Cyber-crime has become the number one issue that governments are now addressing. And whether the targets are governments, companies or individuals, the threat can have far-reaching implications. We hear about these breaches on the news and wonder what the impact is.

Recently, TalkTalk, a telephone and broadband supplier in the UK, was hacked, with the hackers gaining access to TalkTalk’s client account information. While on vacation in the UK, the day after the breach was reported, we overheard a customer in a pub telling the bartender that his bank had called to say someone had gone into his bank account and attempted to clear out his money, based on information in his TalkTalk account. What was interesting about this particular case was that the company had had an audit about two years prior to the event, in which the lack of system security was flagged, yet nothing was done to upgrade it.

On January 3, 2016, this note was on TalkTalk’s website:

Welcome to TalkTalk

We’re currently making security enhancements to our site, which should be back online soon.
While we do this, our customer team are there to help you with details on the packages below or upgrades. Just give them a call.

It is no longer sufficient to assume that information security is the IT department’s responsibility. Information security, privacy and disposition problems cross the boundaries between departments that share responsibility for identifying and protecting corporate information. Being able to understand information security concepts and issues is critical to RIM within today’s workplace.

 

RM Meets Technology

In my first job as Archivist at the Toronto Harbour Commission, I was responsible for purchasing a Wang word processor and learning how it functioned. That was 1978 – and some time before personal computers were adopted as a standard technology in all organizations. At the AGO in 1984, as the Manager of Administration and Archives, I was involved in the overall technology strategy because of my role in Archives and Records Management.

In 1989, as Manager of Archives and Records Management at CIBC, I worked with RIM staff to select and implement a records management software package designed to support the storage and disposition of about 400,000 boxes in the Toronto Records Centre. In addition to selecting software for our own records management purposes, the team was involved in the selection of an imaging system and worked with IT to build retention requirements into the system development lifecycle. Understanding what was happening with technology was not a nice-to-do…it was a must-do.

For many years, the mantra “storage is cheap” could be heard in many organizations as more and more information was created electronically, and IT “retention” was focused on moving live data to tiered storage, not on the value of the data to the organization. Managing data was complicated and therefore something that didn’t happen, until concerns were raised about legacy systems and Y2K. At that point, organizations were still, for the most part, managing “records” on paper, and data was retained for disaster recovery, backup and security purposes. Given the transition from paper records to data and information in systems, and the lack of retention applied to legacy systems and backups, IT organizations suddenly found themselves with terabytes of stored data that could not be disposed of because no one knew what it was. What was the risk of deleting it without any awareness of what was in those systems? How could you justify that in court? On the other hand, all those old tapes sitting in data centres, uncatalogued and not cared for in terms of preservation methods (rewinding, migrating, etc.), could be a huge liability in eDiscovery. Being at the table for those discussions was part of the RIM responsibility.

 

Defining Software Requirements: The Canadian Influence

Canada and Canadians have played a leading role in designing specifications for records management software, as well as in creating software products based on those specifications. Early work in Canada began in 1983 through an initiative of the Department of Communications and the National Archives of Canada: the Office Communications Systems Field Trial Program. Designed to study how 70 users, linked together through a local area network, created, used and disposed of electronic information, it provided important research data for ongoing solution development. Its work resulted in the creation of the Information Management Office Systems Advancement (IMOSA) project, a joint initiative between the National Archives of Canada, the Department of Communications’ Canadian Workplace Automation Research Centre and Provenance Systems24. In 1990, as a result of the IMOSA project work, the National Archives published a set of functional requirements for software to manage electronic information in the Federal Government, known as FOREMOST (Formal Records Management for Office Systems Technologies).

At the time the Canadian requirements were developed, the electronic records community within the International Council on Archives was collaborating on software requirements. While it would be difficult to say that Canada was number one in creating the requirements, it would be fair to say that the work undertaken through the IMOSA project was on the leading edge of defining electronic records management software requirements. A number of software requirements initiatives were subsequently developed in Australia, the US and Europe and continue to be enhanced today.

 

Creating Electronic Records Software Solutions

Canada, again, has played a significant role in electronic records management software development. As a result of his work with the Canadian Federal Government, Bruce Miller established Provenance Systems in 1989 and created FOREMOST, an electronic records management software package which was sold to Documentum (later EMC) in 2002, forming an integral part of the Documentum Records Management product. Bruce subsequently created Tarian Software and the next-generation electronic records solution, the Tarian eRecord Engine, which was acquired by IBM in 2002 to become IBM Records Manager.

Records managers globally are familiar with OpenText and its Livelink product suite. Its evolution from its beginnings to where it is today is a great study in Canadian innovation and development. In his Foreword to Open Text Corporation: Ten Years of Innovation25, Tom Jenkins, then CEO of Open Text Corporation, wrote:

It’s hard to imagine today, but Open Text Corporation started out in 1991 as a small three-person consulting operation, a spin-off of the University of Waterloo.

With the Internet in its infancy, OpenText was responsible for creating one of the first search engines for Netscape and Yahoo. So how did they get into records management – evolution or revolution?

Before OpenText made its foray into search engines, Canadian Federal Government departments were dealing with physical file management challenges, and in 1986 a group of Ottawa-based entrepreneurs saw an opportunity to fill the gap, creating iRIMS, a PSSoftware product. As a result of the changing environment, iRIMS expanded its product line to address electronic, physical and image-based records. In 1999 it was purchased by OpenText and its functionality integrated into the suite of products, which continue to evolve today.

And where exactly are we today? As technology has evolved, so too has the functionality of these products. Records management functionality has been built into a number of products dealing with an organization’s information assets. Whether the company calls it records management, information management or information governance, at the end of the day, these software tools help manage the lifecycle of the information resources, ensure that information can be found, protected and used as required and disposed of to meet legal and compliance requirements.

 

The Legal Perspective: Electronic Records and eDiscovery

In any organization, the Legal group has always been a critical partner with records and information managers as a result of incorporating legal and regulatory requirements into retention schedule development. In the past 15 to 20 years, interest from Legal departments and law firms in records and information management has increased as a result of electronic records and the role they play in litigation and eDiscovery.

For many records managers, a major shift in organizational records management awareness came as a result of changes to the U.S. Federal Rules of Civil Procedure in 2006, which introduced “electronically stored information” (ESI), a new type of discoverable information, under Rule 34. A major concern in organizations across the US arose from the volume of electronically stored legacy data and backup tapes that were subject to discovery if the organization had not had an effective program in place to dispose of its information in the normal course of business.

With the advent of technology, the challenges of the discovery process were not unique to the US although for many of us, as members of ARMA International, the Sedona Conference and the changes to the Federal Rules were probably our first introductions to the connection between eDiscovery and records management. Ontario and other jurisdictions in Canada were facing the same challenges of discovery, given the proliferation of computers in the workplace.

In 2001 the Attorney General and the Chief Justice of the Superior Court of Justice appointed a Discovery Task Force, chaired by Justice Colin Campbell, Superior Court of Justice, Toronto Region, to look at existing practices and propose options for reform. The report, presented in 2003, included two recommendations in its “Discovery of Electronic Documents” section, expanding the scope of discovery:

  • Amend rules 30.01 and 31.01 to include in the definition of document “data created and stored in electronic form.”
  • Develop best practices with respect to retention of electronic records and the scope, cost and manner of electronic documentary production.

No longer was discovery only about paper, and the issues around electronic records and information management were again brought to the forefront.

The work of The Sedona Conference has been pivotal in addressing issues around eDiscovery, data protection and privacy, and information governance. Sedona Working Group One (WG1), made up of representatives from the US legal community and members of ARMA International, among others, focused on the development of electronic document retention and production guidelines, publishing The Sedona Principles: Best Practices Recommendations and Principles Addressing Electronic Document Production in March 2003. The guidelines provided detailed interpretations and insights on how organizations could apply the Principles in preparing for litigation.

The Sedona Principles became an integral part of eDiscovery guideline development in Ontario as a result of the participation of Susan Wortzman26, the first Canadian to attend a Sedona Conference meeting. Susan joined WG1 while a member of the Ontario Discovery Task Force. As a result of her participation on WG1, Ms. Wortzman worked with The Sedona Conference to set up Working Group 7 (WG7), Sedona Canada, which created the Sedona Canada Principles. As stated on The Sedona Conference website, WG7 was formed in 2006 with the mission:

“To create forward-looking principles and best practice recommendations for lawyers, courts, businesses, and others who regularly confront e-discovery issues in Canada.” The first edition of these Sedona Canada Principles27 was released in early 2008 (in both English and French) and was immediately recognized by federal and provincial courts as an authoritative source of guidance for Canadian practitioners. It was explicitly referenced in the Ontario Rules of Civil Procedure and practice directives that went into effect in January 2010.

In November 2015, the 2nd edition of The Sedona Canada Principles was issued and Working Group 7, open to interested Canadian residents, continues its work on eDiscovery and information governance issues in Canada.

 

The Electronic Discovery Reference Model

Launched in May 2005, EDRM28 was established to address the lack of standards and guidelines in the e-discovery market. One of its outputs was the Electronic Discovery Reference Model, published in 2006, designed to define the steps in an eDiscovery process, in which the first step was records management. Putting records management first showed the importance of ensuring that records were managed and disposed of in accordance with retention schedules and business practices; the premise was that by managing those resources effectively, there was less data and information to be waded through in the event of litigation. What is interesting to note in this evolutionary process is that the first box in the model changed from records management (2006) to information management (2007) to information governance (2014), showing the change in the workplace and in the overall thinking about records, information and technology.

The EDRM website describes information governance as:

“Getting your electronic house in order to mitigate risk & expenses should e-discovery become an issue, from initial creation of ESI (electronically stored information) through its final disposition.”

eDiscovery has driven changes in the way organizations view their information assets as a result of the liabilities and risks associated with not managing information properly. So how does risk management fit into the picture and how do we assess our information risk?

 

RIM Risk Assessment Supports Business Strategies

Within any organization there are many different types of risks which are assessed to protect the organization’s operations. These may be overall business risks, operational risks, financial risks and so on. They are usually defined and quantified so that they are measurable. And behind all these risks are records and information that capture the details of the organization’s business activities and decisions.

Many of us have applied for credit cards, loans and mortgages. We are assessed on our credit history and provided with a yes or no response, based on the risk that lending the money to us poses to the financial institution. There are measurement criteria, assessment models and methods to weigh the results of the assessments, all of which are quantified as part of a risk management framework. In order for a decision to be made, people requesting the loan complete forms, staff do analyses and report their decisions, and the documentation, in whatever format it is collected, creates a history of the transaction. How important are those risk analysis records to the organization?

In the case of technology installations and implementations, a risk analysis looks at all aspects of the implementation: what risk factors may impact the project, what the impact would be if one of those factors occurred, and how the impact can be minimized and the risks reduced. The records created document the decisions made and provide a tracking mechanism throughout the project.

Many of us who implemented vital records programs as part of business recovery initiatives were involved in undertaking a risk assessment of the value of the records to the organization should some type of emergency arise: a natural disaster; physical damage within our facilities, such as fires or burst pipes; or a disgruntled employee stealing data or sabotaging the system. We looked at the potential occurrences and the frequency with which they might occur, analysed the scenarios and determined which of the records created and stored were either high risk if lost or critical to the start-up of the business post-disaster.

Because of the large volumes of information created and stored in today’s workplace, organizations have begun to take a risk-based approach to managing records. These approaches are captured in Managing Risks for Records29, in which the author, Dr. Victoria Lemieux, presents two approaches to records and information risk assessment:

  • Event-based risk assessments, such as those typically used in vital records programs
  • A records and information requirements-based approach

In the second approach, Dr. Lemieux suggests that rather than events being the basis of the risk assessment, the value of the records to the strategic business direction drives a cross-functional approach to managing information risk. Her recommendation for records and information risk management is to integrate it into the overall risk management function and culture, business operations, training, and strategic development and budgeting, rather than have it as a separate, standalone activity within a records management program. Her book presents details of the two approaches and provides examples of the consequences of failing to manage records and information risks.

As with the other components we have discussed so far as part of the information governance framework, managing risk is an integral part of the activities, and resources such as Dr. Lemieux’s book are available to assist in the transition from the traditional view of vital records issues and risks to the business risks of the organization, something that RIM professionals need to understand and be able to discuss as part of a cross-functional team.

 

RM to IM to IG: Changing Skill Sets

So how do we know what we need to know to be successful RIM and IG professionals, if the world and our environment are changing around us daily? As the records management profession started to change in the 1990s, there was an identified need to define what activities fell into the “records management” profession. A review of the Canadian Federal Government’s job classification codes showed that there were no clearly defined categories for records managers because the profession itself was not clearly defined. Skills and competencies existed for professions such as lawyers, doctors and accountants; no such definitions existed in the records and information management community in North America at the time.

 

ALARM Competency Model

In 1994 ARMA Canada participated in the Human Resources Development Canada’s (HRDC) Alliance of Libraries, Archives and Records Management30 initiative to examine human resource development challenges facing the Information Resources Sector31. In its Competency Tool Kit32, ALARM was described as:

Unique in the fact that it has brought together the three occupation areas (Libraries, Archives and Records Management) and has begun to demonstrate the promise of collaboration among the three professions in identifying and responding to common human resource needs.

The groups met for about five years and created not only the detailed set of competencies but also supporting toolkits on using them. The ALARM33 competencies comprised seven professional competencies supported by three general sets of skills, which defined, at a high level, the core activities carried out in managing information resources. Committee members from the three professions agreed that, while managing a variety of information resources, Librarians, Archivists and Records Managers:

  • Create and maintain programs and services
  • Acquire and dispose of information resources
  • Create a framework for access to information resources
  • Provide reference, research and advisory services
  • Provide electronic services
  • Store and protect information resources

And required:

  • Business/management skills
  • Interpersonal skills
  • Personal skills

The ALARM competencies were used for hiring, selecting, training and managing the performance of staff in addition to supporting RIM education program development and were, perhaps, ahead of their time from the perspective of collaboration between the individual professions.

Were we alone in developing competency models? Definitely not. Were we, again, leaders in the process? Yes, we were.

 

ARMA International Competency Models34

The first set of competency models developed by ARMA International, in 2007, focused on records and information management competencies. They differed from the ALARM competencies in that they broke the competencies into four levels, from entry-level practitioner to executive-level professional, organized into six domains:

  • Business Functions
  • RIM Practices
  • Risk Management
  • Communications and Marketing
  • Information Technology
  • Leadership

As records management moved to information management, the Canadian General Standards Board (CGSB) issued CGSB-192.2-2009: Competencies of the Federal Government Information Management Community. Its format differed from both ARMA and ALARM and as of the date of this article35, a review had been proposed but did not happen due to a lack of interest. The standard is still available on the CGSB website.

ARMA International began to expand its member offerings to include information governance and created the Information Governance Professional (IGP) certification program in 2012. In describing the role of an Information Governance Professional ARMA International states that:

A Certified Information Governance Professional creates and oversees programs to govern the information assets of the enterprise. The IGP partners with the business to facilitate innovation and competitive advantage, while ensuring strategic and operational alignment of business, legal, compliance, and technology goals and objectives. The IGP oversees a program that supports organizational profitability, productivity, efficiency and protection.

To support the Information Governance Professional designation, ARMA International created a set of competencies, complementary to ARMA International’s RIM competencies, which state that an IGP has the ability to:

  • Manage Information Risk and Compliance
  • Develop an IG Strategic Plan
  • Develop the IG Framework
  • Establish the IG program
  • Establish IG Business Integration and Oversight
  • Align Technology with IG Framework

In terms of defining what an IG program will look like, the competencies provide an overview of program activities through the domains and related knowledge and skills. They can also help records and information management professionals determine where there are gaps in their existing skill set and develop a personal career path, identifying opportunities for training and education.

My personal belief is that information governance is a response to a changing workplace, hugely impacted by technology, and requires collaboration between several groups to ensure consistency and reduce duplication of effort in managing information resources. Records and information management skill sets are being enhanced and expanded alongside those of other professions such as legal, privacy, risk and IT, driven by a need to address what are truly enterprise-wide issues.

 

Looking Ahead: Roles and Responsibilities

Looking at information governance from the 30,000-foot level, it is clear that information assets are the common element and that an information governance framework is required to ensure they are managed in compliance with legal and business requirements. However, unlike records management, which has traditionally been the purview of one designated community, information governance issues are more complex and require input from a number of different communities based on their needs and concerns. The concerns vary, depending on the specific group:

  • The Privacy Officer ensures that personal information is collected, used and disposed of appropriately in accordance with privacy legislation.
  • The Legal department or law firm is concerned with ensuring that the internal and external clients are aware of the retention and eDiscovery issues around all records, information and data. The costs of searching through data to support litigation have, in the past few years, resulted in an increased awareness of the benefits of effective information governance as part of ongoing business practices.
  • The IT department is looking after all the systems and the data created and stored in them and has to ensure that information is secure, retrievable and accessible for as long as it is required through not only active data management but also through digital continuity and preservation to address changes in software and hardware.
  • Records and Information Management provides the RIM guidelines, standards and policies and procedures which ensure that information is managed appropriately from creation to disposition.
  • Employees are now far more aware of information assets and the impact of technology, although their expectation is that managing those assets will be transparent so that they get what they need to do their day-to-day work. Anything which makes managing information onerous for the user will be rejected!

To further the discussion, the IGI created a RACI chart based on responses to their research.

Who is responsible for what will depend on the organization you are in and on the focus of your strategic business goals, activities and regulatory environment. My father always used to say, “if you know where you are going, there is more than one road to take you there.” Such is the situation with information governance. However, any successful program will depend on a champion and a cross-functional team supported by the necessary tools and technologies.

 

Conclusion

Some years ago, a session facilitator at the National Association of Government Archives and Records Administrators (NAGARA) Conference in Sacramento, California suggested that Archivists and Records Managers should change their story from one which was all about doom and gloom and full of jargon, to one which resonated with the creators of the records and information. Certainly the story is changing!

There is no doubt that information governance, under whatever name, will become a critical part of every organization given issues of privacy, eDiscovery, risk and compliance and value proposition of the information assets to the organization’s strategic position.

Records management has changed, not gone away, because there is still a need to manage records as evidence of business decisions and transactions. For records, data, information, knowledge, whatever it is called, to be useful to the organization it needs to be managed throughout its lifecycle. Discussions about what IG is and who is responsible will continue as our work changes. Each organization will design and implement a program based on its strategic direction, resource availability and risk and compliance requirements. The big shift is in the need to create cross-functional teams to address the issues from an enterprise perspective, not a siloed focus.

We have a rich resource of work that has been done in Canada as we have moved through the various challenges posed by technology and evolved from Records management to where we are today. We have a proud heritage of development and leadership in Archives, Records and Information Management. We can learn from it, build on it and look at the present research to see where we are heading in the future so that we are ready for the challenges ahead and embrace them. And our Canadian colleagues will continue to lead in different ways to enhance our understanding.

As I said at the beginning, this is not a comprehensive, all-inclusive study and I have omitted many people and projects, to whom and for which I apologize. There is much more to be added and I encourage you to build on it and prepare an article for the next ARMA Canada publication. The seeds have been planted for the flowers to grow. There is a huge opportunity to add to what has been started so let’s create the visible future!

 

Appendix A: ISO Standards

Information Technology

ISO/IEC 20000-1:2011 Information technology – Service management – Part 1: Service management system requirements
ISO/IEC 27014:2013 Information technology — Security techniques — Governance of information security
ISO/IEC 38500:2015 Information Technology Governance of IT for the organization

Information Management

ISO 30301:2011 Information and documentation – Management systems for records – Requirements
ISO 15489-1:2001 Information and documentation – Records management – Part 1: General (under revision)
ISO 16175-1:2010 Information and documentation – Principles and functional requirements for records in electronic office environments – Part 1: Overview and statement of principles
ISO/TR 17068:2012 Information and documentation – Trusted third party repository for digital records
ISO/TR 18128:2014 Information and documentation – Risk assessment for records processes and systems
ISO 23081-1:2006 Information and documentation – Records management processes – Metadata for records – Part 1: Principles
ISO/TR 26122:2008/Cor 1:2009 Information and documentation – Work process analysis for records

 

Endnotes

URLs checked as of February 15, 2016

1Canadian Atlas online source http://www.canadiangeographic.ca/atlas/themes.aspx?id=connecting&sub=connecting_technology_wireless&lang=En
2See Gil Press: Forbes – http://www.forbes.com/sites/gilpress/2013/05/09/a-very-short-history-of-big-data/2/#170e0de71af0
3http://www.emc.com/collateral/analyst-reports/idc-digital-universe-are-you-ready.pdf
4http://www.emc.com/leadership/digital-universe/2014iview/index.htm
5https://www.igt.hscic.gov.uk/Home.aspx?tk=424106365366405&cb=7e8b504b-fbb7-488d-aefd-a8894cfb45b1&lnv=7&clnav=YES
6Note that Gartner, as do many organizations, uses the term archiving for setting aside inactive records into cheaper storage. For those involved in the archival profession, the use of the term causes confusion when distinguishing between the 5% of records which capture the long-term corporate memory of a private/public sector organization.
7TSC, founded in 1997 by Richard G. Braman, is a nonprofit, 501(c)(3) research and educational institute dedicated to the advanced study of law and policy in the areas of antitrust law, complex litigation, and intellectual property rights
8Images for Information governance framework
9www.edrm.net
10William Benedon was President of the American Records Management Association and editor of Records Management Quarterly and was awarded the Emmett Leahy Award in 1968 for his outstanding contribution in the field of records management
11Prentice-Hall, Inc. 1969
12A term made popular in the 1980s and 1990s as PCs became more widely used in the workplace
13As in other National Archives programs where the issue of preserving data for historical reference and research was a major concern, in addition to preserving historical records on paper.
14“Information: To Share Or Not To Share? The Information Governance Review”
15http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/tb_h4/holdings-fonds04-eng.asp
16Dr. Vale later became the head of the Information Management Branch in the Government of Ontario
17http://www.im.gov.ab.ca/documents/imtopics/IMFrameworkSummary.pdf;
http://www.im.gov.ab.ca/documents/imtopics/IMFrameworkReport.pdf
18http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm
19Coincides with the principles contained in the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data of the Organization for Economic Co-operation and Development (OECD), to which Canada agreed in 1984
20Including records containing personal information
21The concept was created by Dr. Cavoukian “to capture the notion of embedding privacy into technology.” https://www.ipc.on.ca/english/Privacy/Introduction-to-PbD/
22https://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=329
23Replaced ISO 17799
24Provenance Systems was founded in 1989
25Published in 2001 by Open Text Corporation. The name Open Text has become OpenText since the book was first written. The author has referenced the name as it was at that time.
26Susan Wortzman is the founder of Wortzmans based in Toronto, Ontario. http://www.wortzmans.com
27https://www.google.com/search?sourceid=navclient&aq=&oq=Sedona+Canada+Principles&ie=UTF-8&rlz=1T4AURU_enCA499CA500&q=sedona+canada+principles+addressing+electronic+discovery&gs_l=hp..1.0l2j0i22i30l2.0.0.0.5630047………..0.zGJBBlnet3o
28Since its launch, EDRM has comprised 400 organizations, including 195 service and software providers, 88 corporations, 76 law firms, 24 governmental entities, 12 educational institutions and 5 industry groups involved with e-discovery and information governance.
29Lemieux, Victoria, Managing Risks for Records and Information. Lenexa, KS; ARMA International, 2004
30The committee was made up of ARMA International, CCA, AAQ and ACA and Associations representing different types of libraries across Canada.
31The title was selected in an attempt to create a common terminology that would cross all three areas.
32Published in August 2002, by the Cultural Human Resources Council
33http://www.culturalhrc.ca/heritage/e/01-01-00.php
34http://www.arma.org/r1/professional-development/education/competencies
35February 2, 2016

From Records Management to Information Governance

 

1.0 Introduction

Christine Ardern, The Information Management Specialists, February 2016

When I began writing this article, I had a very clear idea of the subject I wanted to address, based on the experience I had gained in records and information management over a career that spanned 1975 to 2015. My idea evolved, however, just as the world around us has. Reflecting on the 40 years I devoted to archives and records management, I realized that my career, which allowed me to work in a wide variety of organizations, had also allowed me to experience every aspect of information governance, even though it went by different names over the years. I am convinced that understanding where we come from helps put into perspective what awaits us in the future.

It was never my intention to produce an exhaustive study of the transition from records management to information governance. Above all, this article offers a personal view of how our practice has evolved over recent decades. To support my points, I describe several Canadian projects that influenced this evolution and had significant impact. I admit that I was privileged to work in organizations that offered me varied records management challenges. I had the opportunity to take part in a great many projects, from developing strategies to operating records repositories, setting up archival facilities, and serving as a records and information management advisor at one of Canada’s largest financial institutions. Needless to say, all these experiences offered valuable learning opportunities and exposed me to the many forms of what we have now come to call information governance.

I have attended many ARMA International conferences over the years (the first was Toronto in 1975!). One of the presentations that made the greatest impression on me was given by Daniel Burrus, the keynote speaker at the ARMA International conference held in Chicago in 2005.

Mr. Burrus addressed the theme of the visible future in his presentation, Future View, A Look Ahead. One of his main themes is captured in the following statement:

“I invite you to visit a place I call the visible future. It is a place you can see clearly, but to get there you have to take the time to look. Most of us never take the time to look. The visible future is a future that is fully predictable. The more you look at the future and think about the form in which you know it will unfold, the more you can profit from that future.”

In 1995, at another ARMA International conference, this time in New Zealand, one of the speakers showed us what the telephone of the future would look like, based on research in the field. His predictions seemed completely far-fetched. Imagine! Being able to see the people you are talking to! Twenty years later, we have to admit that speaker was right. That is the visible future. Daniel Burrus’s main argument is this: if we pay attention to the innovations and research under way in organizations such as MIT and CERN, those organizations will let us glimpse what the future holds.

Looking at the road we have travelled, it is easy to see that advances in technology have shaped the big data environment we work in today. Can we imagine a world without email and web pages? Yet that was the case 50 years ago. How has technology changed our workplace and brought us to this era of information governance? Did we arrive here suddenly, or did this new reality emerge through an evolutionary process? As the following dates show1, innovation never stops:

  • 1930s: the mainframe computer is conceived
  • 1969: creation of ARPANET, the direct ancestor of the Internet
  • 1976: creation of the Apple 1 personal computer
  • 1979: the first cellular telephone network is launched in Japan
  • 1981: the first IBM personal computers are sold to the public
  • 1989: Tim Berners-Lee and Robert Cailliau develop the prototype of the World Wide Web at CERN (the European Organization for Nuclear Research)
  • 1994: the U.S. government relinquishes its control of the Internet and the World Wide Web is born. Barely twenty years separate us from that moment. How has the Web influenced the work environment?

What has changed, and why is the change happening now? In the early 1990s, personal computers began to infiltrate offices, spreading the idea that records management would no longer be necessary. Computers could take care of everything. The Web was in its infancy and was used mostly for email. Organizations still relied heavily on internal mail and Canada Post to send their letters and reports. Before the Internet, the information produced by computers was managed and secured within organizations by information technology (IT) departments. Only later was data distributed and transmitted electronically between departments and business units located in different places and countries. The Internet completely changed the world of information.

 

The Big Data Era

Big data did not appear overnight. As early as the mid-1940s, authors were examining the problems raised by creating and preserving the quantities of data generated by computers2.

Think of your organization and of how information is received, used, transmitted and stored there. Then think about how much the same is true in your personal life. How many data-generating devices do you have in YOUR home? How many different ways are there today to create and share information with one another? The following list offers a glimpse of why data volumes keep growing every day. And if that data supports your organization’s activities, it is essential to manage it properly.

  • Internet and wireless transmission let us produce and share information through text messages, blogs, Twitter, Instagram, Facebook and a myriad of other social media.
  • Smart devices communicate with each other. Utilities (electricity and gas) install smart meters. The OnStar system can tell you what maintenance your car needs, based on data collected by the vehicle’s onboard computer. The GPS systems in your phones and tablets can tell you where you are. All of this data is retained by the organizations that track it.
  • Large organizations such as financial institutions and insurance companies collect phenomenal amounts of data every day through the banking transactions consumers carry out online at home, at ATMs, or through the other service delivery channels the institutions make available.
  • Beyond the structured data held in various systems, there is also the unstructured data employees produce in their daily work through network drives, email and the content management applications that support their business systems, among others.

In 2007, EMC and IDC3 published the first Digital Universe study, attempting to predict the growth in the volume of data generated by the Web. Their 2014 forecast indicates that by 2020 the digital universe (the data we create and copy each year) will reach 44 zettabytes, or 44 trillion gigabytes4.

The maxim that “storage is cheap” and that we can afford to keep everything has come back to haunt IT departments, which now face masses of old, often inaccessible data. Over time, managing electronic data becomes costly because software evolves and becomes obsolete so quickly. Every organization must dispose of material in the normal course of business in order to reduce its costs and risks and improve its efficiency. But take care not to discard essential data: you never know when a judge might order you to disclose information, and failing to produce the requested data could cost you millions of dollars in this litigious era.


How Far Back Does Information Governance Go?

Information has been part of our daily lives for a very long time, whether we call it non-record material, transitory information, publications or databases. Until quite recently, organizations kept (and still keep) paper records as their primary source of evidence of business transactions and decisions. With the advent of IT and the proliferation of transborder data flows, issues such as privacy, data ownership and data security take on a whole new meaning once we realize how easy it is to access electronic information assets.

Information and records are stored in an ever-growing number of places. Meanwhile, organizations are changing their delivery models so they can serve their customers through social media and cloud computing. Every day, new data is produced that must be managed, used, stored and deleted in accordance with operational requirements and applicable regulation. As the volume of records and data stored electronically grows, the practices used to manage the mass of paper records must now be adapted and improved to meet the needs of electronic recordkeeping.

Many of us belong to North American associations, and the articles we read are often written from that perspective. How long has our field been talking about “information governance”? That question was put to attendees at the conference the Information Governance Initiative (IGI) held in 2015 in Hartford, Connecticut. Five years? Ten years? More than ten? The majority of participants chose the first answer: five years. Given the meaning we attach to “information governance” in this article, the real answer is more than fifteen years. The need to protect personal information lies at the origin of the concept of information governance.

The need to ensure the confidentiality, security, retention and final disposition of data is a constant, whatever its storage medium or nature. Organizations now recognize the benefit of bringing together, in cross-functional teams, the departments and groups that previously worked in silos. By collaborating in this way, these teams can solve problems globally, across the organization, keeping pace with new technologies and the expanding scope of transmission and exchange. The policies and procedures of individual departments cannot be developed independently of one another around a single focus such as IT, privacy, business or legal operations, risk, or information. The needs overlap and must be addressed as parts of a whole. This is where the concept of information governance proves useful, building bridges between these departments. The various aspects of information governance have been in place for a very long time, but they were the responsibility of separate management functions. So how do we define information governance today?


Perspectives on Information Governance

Just as information governance has evolved, its definition has also shifted depending on the setting in which it was practised and the perspective adopted. Common themes nevertheless run through all definitions of information governance. In the United Kingdom, the National Health Service developed a toolkit5 to ensure that information collected in the course of delivering health services is handled in accordance with the Caldicott guidelines (discussed later). These guidelines state that:

  • Information governance must consider how organizations handle information. It covers personal information, that is, information about patients, service users and employees. It also covers organizational information, such as financial and accounting records.
  • Information governance gives employees a consistent method for putting into practice the various rules governing information management.

 

Gartner Inc.

Founded in 1979, Gartner is known for research and reports that are considered authoritative in the IT and information management communities. In 2007, Gartner concluded that information governance was a priority issue for its clients and defined it as follows:

“The specification of decision rights and an accountability framework to ensure appropriate behaviour in the valuation, creation, storage, use, archiving6 and deletion of information. It includes the processes, roles, standards and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.”

 

The Sedona Conference7

A definition of information governance was offered in “The Sedona Conference Commentary on Information Governance,” published in fall 2014 in volume 15 of The Sedona Conference Journal:

“[Information governance] means an organization’s coordinated, interdisciplinary approach to satisfying information compliance requirements and managing information risks while optimizing information value. Information governance encompasses and reconciles the various legal and compliance requirements and risks addressed by different information-focused disciplines (records and information management, data privacy, information security, electronic discovery). Examining the objectives of these disciplines reveals functional overlaps that can be leveraged (if synergistic), coordinated (if parallel) or reconciled (if conflicting).”

 

The Information Governance Initiative (IGI)

Formed in 2013, the Information Governance Initiative (IGI) is both a cross-disciplinary consortium and a think tank dedicated to advancing information governance practices and technologies through research, publications, advocacy and peer networking. In its 2015-2016 annual report, the IGI presented its definition of information governance along with a set of components, established from input provided by IGI members:

“Information governance is the activities and technologies that organizations employ to maximize the value of their information while minimizing associated risks and costs.”

 

The Components of Information Governance8

In its 2015-2016 annual report, the IGI published the findings of a survey sent to IGI members asking them to indicate which of a list of 22 activities fell within the scope of information governance. The following chart shows the survey results:

Several of these components, grouped together, would align with the Information Governance Reference Model as defined in the 2011 white paper How the Information Governance Reference Model (IGRM) Complements ARMA International’s Generally Accepted Recordkeeping Principles (GARP®)9:

“The Information Governance Reference Model supports ARMA International’s Generally Accepted Recordkeeping Principles by identifying the cross-functional groups of key information governance stakeholders and by defining their joint objectives within the organization.”

The Information Governance Reference Model lists the areas that have a stake in the organization’s information assets and that must collaborate given today’s work environment. Records and information management is one component of this cross-functional approach.

In practice, how you experience the various components of information governance will vary with the type of organization in which you apply records management principles and with your responsibilities. Someone who has worked in a Canadian school board for a few years may well have been responsible for privacy, records management and compliance. In a financial institution, that same person might hold a records manager position and be responsible for building security, retention and confidentiality requirements into the systems design process.

 

Looking Back

1960 to 2000: A Period of Transition

To understand how the transition from records management to information governance took place, it is worth examining what each of these two elements is and how each is defined. In 1969, Bill Benedon10, regarded as one of the leading authorities on records management, published Records Management11, a book devoted, as its title suggests, to designing and implementing records management programs. Benedon writes:

"Records management is a well-chosen term to encompass present and future information-handling activities. Although innovations such as magnetic tape and other forms of miniaturized documentation change the complexity of the record, they raise the same problems of retention, storage, forms design, reporting requirements, data protection and, of course (the oldest problem of all), the filing requirements that we now refer to by the more sophisticated name of retrieval or document search."

He adds: "it will be a long time before accounting and audit staff are prepared to say that we can destroy a source document once it has been reproduced on a machine-processable medium."

Nearly 50 years later, having watched the paper source document slowly migrate to digital media, we must now tackle the problems raised by the final disposition of digital records!

In 1981, the second edition of Information and Records Management by Maedke, Robek and Brown was published. The book became the foundation of many records and information management training programs. It defines records management as follows:

"The application of systematic and scientific control to the recorded information required in the operation of an organization's business. Such control is exercised over the creation, distribution, use, retention, retrieval, protection, preservation and final disposition of all types of records within an organization."

In 2001, the first international records management standard (ISO 15489), published by the International Organization for Standardization (ISO), defined records management as:

"[the] field of management responsible for the efficient and systematic control of the creation, receipt, maintenance, use and disposition of records, including processes for capturing and maintaining evidence of and information about business activities and transactions in the form of records."

The common element in all these definitions is the need for a systematic approach to managing records and recorded information. So what has changed? Did records disappear, or were they absorbed into information management once technology began to emerge in the workplace?

 

Records and Information Co-exist

Information, like records, has always been produced and captured on many types of media and in a variety of formats. In the paper-based era we had records, non-record material and ephemera (transitory information). Convenience copies, draft reports, research reports and minutes were all produced daily in the course of the organization's business. They had different retention periods and were destroyed in accordance with the policies in force. They were not, however, considered "records" because they did not constitute evidence of the organization's decisions.

Before the advent of personal computers, the Internet and social media, databases were used to store information and to print the results of searches run against them. Financial reports, payroll summaries, inventory sheets and the like were treated as records and subject to the retention periods the organization had adopted. From an IT perspective, retention referred to how long data was kept on servers or on magnetic tape, not to the information content.

As knowledge workers12 generated more and more information on desktop computers, and data was kept in computer systems rather than printed on paper, legislation was amended to allow transactions to be carried out electronically instead of on paper. Everyone felt the effects of the Internet. Less time was now spent printing and filing documents, and more information was stored on shared network drives, legacy systems, backup tapes and so on. Thus began the IT executives' nightmare. Not only was more and more information being produced electronically, but all the obsolete data held in legacy systems and all the old, unindexed tapes sitting in data centres had to be managed as well. What was to be done with all that data?

 

Archivists: Pioneers in Managing Electronic Data

Machine Readable Archives Division: Public Archives of Canada

My first contact with the concept of "electronic records" dates back to my time as archivist at the Toronto Harbour Commission. During a visit to the Public Archives of Canada in the mid-1970s, I met with staff of the Machine Readable Archives Division to learn more about running a records centre, developing records management policies and procedures, and other aspects of archival work. Archivists had to learn how to manage and preserve, over the long term, the data stored in computer systems well before records managers and IT departments did, and Canada was a leader in the treatment of machine-readable records. In his article "FOCUS: The Machine Readable Archives Division of the Public Archives of Canada" (Archivaria, 1978, pp. 176-180), Harold Naugler writes that:

"As part of the federal government's EDP records management program… the Records Management Services Bureau of the Public Archives of Canada began, in 1976-1977, to inventory machine-readable records in some 67 government departments and agencies."

Naugler adds that "there were vast quantities of information in machine-readable form on aspects of Canadian life as varied as employment, crime, disease, immigration and emigration, climate, geology, food production and consumption, housing, transportation, communications and the cost of living."

The methodology used during the inventory process was later turned into a set of guidelines by the Machine Readable Archives Division13. These guidelines were eventually incorporated into the federal government's guide on the administration of information technology for departments and agencies of the Government of Canada.

In 1984, recognizing that preserving digital information was a global issue, Harold Naugler wrote The Archival Appraisal of Machine-Readable Records: A RAMP Study with Guidelines. The work was published by UNESCO's General Information Programme and UNISIST. At the time, the study was considered a benchmark for the management of electronic data.

Meanwhile, groups such as the Association of Canadian Archivists (ACA), the Society of American Archivists (SAA) and the International Council on Archives (ICA) were assessing practices for the long-term preservation of records and data held in computer systems. To manage that data properly, did the person responsible have to understand what the systems were and what data they produced, apply retention rules, and ensure that data formats and media were properly preserved? Yes. Was this called information governance? No. It was one piece of the puzzle.

 

From Records Management to Information Governance: Revolution or Evolution?

In 1989, when I joined CIBC as Director of Archives and Records Management, I sat on the corporate governance committee, whose membership reflected how information governance was viewed at the time. The committee was made up of people from different teams across the company: archives and records management, legal, compliance, privacy, audit and corporate security. The records management group worked with the other members on specific projects; the issues were cross-functional in nature and too large to be resolved by the staff of a single department. The records management group also worked with IT to build records retention rules into the application development life cycle. Software selection, for its part, was carried out jointly with members of the records and information management team, a group of internal consultants who took part in decisions about electronic records and transactional data as well as in hardware and software selection.

Several articles and publications tracing the beginnings of the information governance concept highlight the contribution of Britain's National Health Service (NHS) as a leader in establishing an information governance framework. In 1997, concerned about privacy and the effects of new technologies on health information, Fiona Caldicott chaired an expert panel that reviewed the handling of patient-identifiable health information across the NHS. That review, and subsequent work, produced seven principles, known as the "Caldicott principles"14. These principles are still used today to assess whether patient information is properly managed and protected. They have become one of the pillars of the NHS and are supported by a set of toolkits, policies, procedures and guidelines.

How much exposure records and information management professionals have to some or all of the components of information governance depends on the type of organization they work in. Someone working as clerk for a small municipality may be responsible for records and information management, privacy and legal services. In a large financial institution, an RIM professional may be part of a cross-functional team whose members share responsibility for the components of information governance. For an RIM professional in a law firm, e-discovery may be the determining factor in how information governance is applied with respect to clients. The regulatory environment governing an organization's activities may call for greater emphasis on certain aspects of information governance, but in the end the success of an information governance program rests on the synergy of all its parts.

 

Information Management in the Government of Canada

Information management and records management have co-existed for many years in the Government of Canada. The Canadian government's information management program grew out of the introduction, in 1989, of the Management of Government Information Holdings policy. This Treasury Board Secretariat initiative was intended to "bring together under the concept of information management the following existing policies: Records Management, Information Collection and Public Opinion Research, Micrographics, EDP Records Management, and Management of Administrative Forms."

In 199515, the Treasury Board Secretariat published guidelines on the management of government information, along with a model illustrating the life cycle of information holdings and setting out the responsibilities of information management programs, namely:

  1. Planning
  2. Collection, creation and receipt
  3. Organization, transmittal, use and retrieval
  4. Storage, protection and preservation
  5. Final disposition of records through transfer or destruction.
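The five phases above lend themselves to a simple software model. The following is a purely illustrative sketch, not part of any Treasury Board guideline: the Phase names, the RetentionRule class and the disposition_action function are assumptions introduced here to show how the final phase (transfer or destruction) follows mechanically from a retention rule.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    """Life-cycle phases of an information holding (per the 1995 model)."""
    PLANNING = auto()
    COLLECTION = auto()    # collection, creation, receipt
    ORGANIZATION = auto()  # organization, transmittal, use, retrieval
    STORAGE = auto()       # storage, protection, preservation
    DISPOSITION = auto()   # transfer or destruction

@dataclass
class RetentionRule:
    retention_years: int
    archival: bool  # True -> transfer to archives; False -> destroy

def disposition_action(age_years: int, rule: RetentionRule) -> str:
    """Decide what happens to a record at the end of its life cycle."""
    if age_years < rule.retention_years:
        return "retain"
    return "transfer" if rule.archival else "destroy"

# Example: a 12-year-old record with a 10-year, non-archival rule is destroyed.
print(disposition_action(12, RetentionRule(retention_years=10, archival=False)))
```

The point of the sketch is simply that disposition is a rule-driven decision, which is why the lack of such rules for legacy systems, discussed later in this article, left IT departments unable to dispose of anything.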

At the time, a 55-page user manual supported the implementation of information management across federal government departments. The policy's title changed along with the work environment: the Policy on the Management of Government Information was issued in 2003 and replaced in 2007 by the Policy on Information Management.

Explaining the relationship between records management and information management within the Government of Canada would require a study in itself, with a detailed look at the evolving roles and responsibilities of the Public Archives of Canada (which later became the National Archives, then Library and Archives Canada) and of the Treasury Board Secretariat in policy development. What used to be called records management in the federal government has now become "recordkeeping practices" under the Directive on Recordkeeping, first issued in 2009:

"Recordkeeping is a resource management function through which information resources of business value are created, acquired, captured, and managed in departmental repositories and used as a vital strategic asset to support effective decision making and facilitate ongoing operations, program delivery and services."

What has changed? Every organization creates information resources. Some of those records must be kept to meet legal and compliance requirements, while others can be destroyed. Under the federal government's recordkeeping policy, records have become information resources of business value. Can we imagine what the future holds?

 

The Government of Alberta's Information Management Program

Because paper files were transferred to records centres at the end of their active life, many records management programs took root in organizations. Managers and the public generally saw records management as a physical activity centred on managing paper records in a warehouse or repository, and despite the arrival of technology in the office, that perception persisted. Stakeholders had trouble accepting that computers generated data and information (some of which had "record" value) and could serve as a place of storage. As computers became more pervasive in the office, organizations began the transition from records management to information management.

How was information to be managed? In a paper-centric environment, the process had been defined in terms of ephemera and non-record material. In a predominantly electronic world, information was captured and stored as a series of zeros and ones. The concept of a "record" therefore had to change to encompass the evolving needs of the workplace. What mattered was the content, not the medium on which it was stored. As a result, various laws on electronic transactions and electronic records were passed to reflect the new reality of the working world.

In Canada, one of the leading government information management programs was the Information and Technology Strategy adopted in 2001 by the Government of Alberta's Deputy Ministers' Committee. The shift from "records" to "information" was driven mainly by the need to move away from the perception the word "record" evoked, namely paper-based data, at a time when the workplace was evolving into an environment where information, some of it with record value, was increasingly held in electronic form.

Recognizing that government information assets, including records, had to be managed, Sue Kessler, then Director of Records Management, working closely with Dr. Mark Vale16, developed an Information Management Framework17. The framework incorporated most of the elements that today's information governance models take into account.

The information management resources and guiding principles established by the Government of Alberta have been, and continue to be, used not only by Alberta government ministries but also by the private sector and by other governments seeking to make the transition from records management to information management.

As records management evolved into information management and then into information governance, the scope of these disciplines broadened even though their core management activities did not change. Whatever form information takes, we still need:

  • Strategies and frameworks;
  • Policies, processes and procedures;
  • Standards and guidelines;
  • People;
  • Technology.

Although the media through which records and information are delivered keep changing and the volume of information keeps growing, the life-cycle and continuum concepts have not disappeared, any more than the accountability and compliance requirements organizations must still meet.

 

Data Protection and Privacy

Driving Forces: The OECD Guidelines18

International organizations such as the Organisation for Economic Co-operation and Development (OECD), of which Canada is a member, turned their attention to the need to oversee data held in computer systems. In 1980, the OECD developed its Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, which state:

"The development of automatic data processing, which enables vast quantities of data to be transmitted within seconds across national frontiers, and indeed across continents, has made it necessary to consider privacy protection in relation to personal data."

Concerned that the amount of personal information held in large data processing systems could eventually endanger its confidentiality, the OECD invited member countries to develop and adopt their own national privacy and data protection legislation, which Canada did in 1983.

With emerging technologies, privacy was not the only source of concern: data that could now easily be transmitted across national borders raised further issues for national laws, notably on data ownership. Local laws governing data ownership rights could restrict an organization's ability to move information within its own structure if it operated globally. Restrictions on transborder data transmission were of particular concern to banks and insurance companies because of the international nature of their business.

The model provided by the OECD Guidelines was intended to support the harmonization of privacy laws while fostering transborder data flows as technology changed how data was disseminated. Marking the 30th anniversary of the Guidelines in 2011, the OECD stated that:

"The stand-alone systems of the 1970s have become a global, integrated and ubiquitous infrastructure. The occasional data flows of the past have given way to continuous, multipoint international data flows. All these changes underscore the need for privacy regulators around the world to work together to develop global approaches in this area. Advances in analytics and the monetization of our digital footprint raise thorny questions about the concept of personal information and the appropriate scope of privacy protections."

Canada, wishing to comply with the OECD Guidelines as a member of the organization, developed a set of provincial and national privacy laws and built a solid infrastructure to preserve the confidentiality of data moving through government bodies. Recognizing the need to provide access to information while ensuring data protection and privacy, the federal government enacted two separate statutes in 1983: the Access to Information Act and the Privacy Act19. As the amount of information held in government databases and computer systems grew, security and control measures were built into the policy frameworks discussed earlier to ensure that personal information was properly collected and managed.

What impact did these laws have on non-government organizations? Why did records managers working outside government need to know about the Access to Information Act? Records that an organization (e.g., the Salvation Army20) is required to submit to the federal government are held in the offices of Canadian government departments. When the Access to Information Act was passed, records managers in non-government settings therefore had to know, as part of their day-to-day records management activities, which records were being sent to the Government of Canada and what measures were in place to control access to, and ensure the confidentiality of, third-party records.


Proactively Building Privacy into New Technologies

Privacy by Design21

The growing amount of information collected and stored in computer systems, and the need to keep data confidential in order to comply with federal and provincial privacy legislation, led to the development of directives and controls.

Ontario has played a leading role in privacy, both in Canada and internationally, through the work of Dr. Ann Cavoukian, who served as Ontario's Information and Privacy Commissioner from 1997 to 2014. Dr. Cavoukian believed (and still believes) that instead of treating privacy as a technology problem to be solved after the fact, privacy standards should be built in from the moment infrastructures and systems are designed. Working with the Registratiekamer, the Dutch data protection authority, she set out her ideas in a 1995 paper, Privacy-Enhancing Technologies: A Path to Anonymity22. Determined to make privacy a routine management practice, Dr. Cavoukian created the concept of privacy by design, which rests on the following seven principles:

  • Proactive not reactive; preventative not remedial;
  • Privacy as the default setting;
  • Privacy embedded into the design of systems and practices;
  • Full functionality: positive-sum, not zero-sum;
  • End-to-end security, for the full retention period of the information;
  • Visibility and transparency;
  • Respect for user privacy.

In 2010, Dr. Cavoukian's privacy by design framework was adopted as the new global privacy standard.

Today, Dr. Cavoukian continues her work at Ryerson University in Toronto, where she is Executive Director of the Privacy and Big Data Institute. Her work has led to a program, run with Deloitte Canada, through which organizations can be certified as compliant if they incorporate privacy into their daily practices and uphold the seven principles of privacy by design.

 

Information and Data Security

For many of us who began our records management careers before the advent of new technologies, the word "security" meant locked offices, filing cabinets and work areas. Security classifications were assigned according to the business value of the information to the organization, whether in paper or electronic form. The most common security classifications were "confidential," "restricted," "internal use" and "unprotected." Organizations could adopt other classifications depending on the type of information involved (e.g., "top secret" and "secret" in government bodies).

Before computerized security controls were in place, specifically designated individuals controlled the assignment of rights to access and view information available online, to ensure that only the right people could reach the right records.
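The work of those designated individuals amounts to a clearance check against the classifications described above. As a minimal illustrative sketch (the labels come from this article; their numeric ordering and the may_access function are assumptions made purely for illustration):

```python
# Classification labels from the text, ordered least to most restrictive;
# the numeric ordering itself is an illustrative assumption.
LEVELS = {"unprotected": 0, "internal use": 1, "restricted": 2, "confidential": 3}

def may_access(user_clearance: str, record_classification: str) -> bool:
    """A user may view a record only if cleared at or above its level."""
    return LEVELS[user_clearance] >= LEVELS[record_classification]

print(may_access("restricted", "internal use"))    # True: sufficient clearance
print(may_access("internal use", "confidential"))  # False: access denied
```

Modern access-control systems automate exactly this comparison; what changed with automation, the cloud and the Internet was not the logic but the number of places it had to be enforced.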

Because most information resources were managed within the organization itself, physical security problems were easier to resolve. With the gradual arrival of automated systems, cloud computing and the Internet, the need for controls grew and the ways those controls are implemented changed.

 

Today's Threats

We all hear about security breaches. As records and information managers, are we really well informed about the risks to which our information assets are exposed?

Information security is not a new issue. In 1986, recognizing that it was closely tied to accessibility and confidentiality, the Treasury Board Secretariat introduced the Government Security Policy. The policy was intended to "ensure that all classified and designated information and assets of the federal government are appropriately safeguarded."

With the arrival of the Internet of Things, BYOD (bring your own device), social media and cloud computing, information security has become another component of the information governance framework. International standards such as ISO 27002, Information technology – Security techniques – Code of practice for information security controls23 (one of the information management and security standards listed in Appendix A), and other security standards are being developed to specify the security measures, practices, procedures or control mechanisms that can:

  • Protect against threats;
  • Reduce vulnerability;
  • Limit the impact of an undesirable incident;
  • Detect undesirable incidents;
  • Facilitate recovery.

Cybercrime has become the number one problem governments are trying to solve. Whether cybercriminals target governments, businesses or individuals, the threat can have serious consequences. We hear about breaches on the news and wonder what their effects might be.

Recently TalkTalk, a telephone and Internet service provider in the United Kingdom, was targeted by hackers who gained access to customer account data. While we were on holiday in the UK, we overheard the following conversation in a pub the day after the breach: the publican had just learned that his bank had called to warn him that hackers had broken into his bank account and tried to siphon off his money using personal information obtained in the TalkTalk breach. What was interesting about this particular case was that TalkTalk had been audited about two years earlier. Investigators had warned TalkTalk that its systems were not well protected, but despite that warning nothing had been done to improve their security.

On January 3, 2016, TalkTalk's website read as follows:


"Welcome to TalkTalk

We are currently strengthening the security of our website. It will be back online shortly. In the meantime, our customer service team is available to help if you need information about the packages below or would like to order a new one. Please don't hesitate to get in touch."

 

Data security can no longer be taken for granted as a matter for the IT department alone. Security, privacy, and the final disposition of information are problems that must be addressed by all of the departments responsible for identifying and protecting organizational information. It is now vital that records and information managers understand information security concepts and issues.

 

Records Management in the Face of Technology

In my first job as an archivist, at the Toronto Harbour Commission, I was given the task of purchasing a Wang word-processing system and figuring out how it worked. That was in 1978, some time before personal computers arrived and became standard workplace technology. In 1984, as Director of Administration and Archives at the AGO, I was asked to help develop an organization-wide technology strategy.

In 1989, as Director of Archives and Records Management at CIBC, I worked with records and information management staff to select and install software to manage the storage and final disposition of some 400,000 boxes at the Toronto Records Centre. Beyond selecting software suited to our records management department's needs, the team helped choose an enterprise imaging system and worked with IT to build retention rules into systems as they were developed. Understanding new technologies and their implications was not a pleasant perk of my position; it was an obligation.
For many years, the refrain "storage is cheap" echoed through organizations as the flood of information generated by computer systems grew and IT departments focused on moving live data into tiered storage rather than on determining its business value. Managing all that data was complicated, so nobody worried about it until the Year 2000 threat loomed and people began asking what would happen to legacy systems. At the time, most organizations were still focused on managing paper records; electronic data was kept for backup and security, or for disaster recovery. With the shift from paper records to electronic data, and with no retention policies covering legacy systems and backups, IT departments suddenly found themselves holding terabytes of stored data they could not dispose of because nobody knew what it contained. What risk would the organization run if that data were deleted without knowing its contents? How could such an action be justified in court? On the other hand, all those old tapes gathering dust in data centres, never catalogued or given preservation treatment (rewinding, migration, and so on), were a serious obstacle to electronic discovery. Records and information management professionals needed to be part of that conversation.

 

Defining Software Requirements: the Canadian Contribution

Canada and Canadians have been leaders in developing specifications for records management software, and they helped build software products that met those specifications. The earliest work in this area began in Canada in 1983 with the office systems field trials program undertaken by the Department of Communications and the National Archives of Canada. The program, designed to study how 70 users connected by a local area network created, used, and disposed of electronic information, produced important research data that informed the design of solutions. That work led to the Information Management and Office Systems Advancement (IMOSA) project, a joint initiative of the National Archives of Canada, the Canadian Workplace Automation Research Centre, the Department of Communications, and Provenance Systems24. In 1990, building on the IMOSA work, the National Archives of Canada published a set of functional requirements for electronic information management software across the federal government, known as FOREMOST.

When the Canadian specifications were being prepared, members of the electronic records community within the International Council on Archives were also working on such specifications. While it is hard to say whether Canada was the principal architect of requirements for electronic records management software, it is fair to say that the IMOSA work was at the leading edge of the field. Other software specification projects were later undertaken in Australia, the United States, and Europe, and they continue to be refined today.

 

Electronic Records Management Software Solutions

Once again, Canada played a leading role in developing electronic records management software. Following the Canadian federal government's work, Bruce Miller founded Provenance Systems in 1989 and built FOREMOST, an electronic records management application. In 2002, FOREMOST was sold to Documentum (EMC), which integrated it into its records management product. Bruce Miller then founded Tarian Software and built the Tarian eRecord Engine, a second-generation electronic records management solution acquired in 2002 by IBM, which folded it into its Records Manager product.

Records managers around the world know OpenText and its LiveLink software suite. Tracing these products from their origins to the present day shows the innovation and design effort Canada has poured into electronic records management. In his foreword to Open Text Corporation: Ten Years of Innovation25, Tom Jenkins, then CEO of Open Text Corporation, writes:

"It is hard to imagine today, but Open Text Corporation began in 1991 as a small three-person consulting firm growing out of work done at the University of Waterloo."

When the Internet was still in its infancy, OpenText built one of the first search engines for Netscape and Yahoo. How did the firm end up in records management? Evolution or revolution?

Before OpenText's foray into search engines, Canadian federal government departments were struggling to manage their physical files. To fill the gap, a group of Ottawa entrepreneurs created iRIMS, a PS Software product, in 1986. As the environment changed, iRIMS broadened its product line to handle electronic and physical records as well as images. In 1999, iRIMS was bought by OpenText, and its functionality was folded into the product suite that continues to evolve today.

Where are we today? As technology has evolved, so has the functionality of these products. Records management features have been built into a number of software products focused on organizations' information assets. Whether the vendor calls this functionality "records management," "information management," or "information governance," in the end these products help us manage the lifecycle of information resources, ensuring that records can be found, protected, and used when needed, and deleted in accordance with the applicable rules.
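The lifecycle functions described above — keeping records findable, protected, usable, and deleted on schedule — can be illustrated with a minimal retention-schedule check. This is only a sketch: the record classes, retention periods, and trigger years below are invented for illustration and do not come from any particular product or jurisdiction.

```python
# Minimal retention-schedule sketch. All record classes and retention
# periods here are illustrative assumptions, not real rules.
RETENTION_YEARS = {
    "invoice": 7,      # assumed fiscal retention period
    "contract": 10,    # assumed: 10 years after the trigger event
    "newsletter": 1,   # assumed transitory material
}

def disposition_year(record_class: str, trigger_year: int) -> int:
    """Year in which the record becomes eligible for disposal."""
    return trigger_year + RETENTION_YEARS[record_class]

def is_disposable(record_class: str, trigger_year: int, current_year: int) -> bool:
    """True once the retention period has elapsed."""
    return current_year >= disposition_year(record_class, trigger_year)

print(disposition_year("invoice", 2016))        # 2023
print(is_disposable("newsletter", 2016, 2018))  # True
```

A real product layers much more on top of this (event-based triggers, legal holds, approval workflows), but the core promise is the same: disposition dates are computed from policy rather than decided ad hoc.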

 

Electronic Discovery

In any organization, legal departments have always been valuable partners for records and information managers, helping to build the many legal and regulatory requirements into the retention schedule. Over the past 15 to 20 years, legal departments' and law firms' interest in records and information management has grown steadily with the arrival of electronic records and their role in litigation and electronic discovery.

In 2006, amendments to the United States Federal Rules of Civil Procedure introduced the concept of electronically stored information: a new category of information that could be subject to discovery under Rule 34. Many records managers came to appreciate the major implications of this new concept. In the United States, organizations began to worry about data held in legacy systems and on backup tapes, because that data was discoverable and the organizations responsible for it had sometimes failed to put effective retention and disposition programs in place in the normal course of business.

With the arrival of new technologies, e-discovery challenges were not confined to the United States, although for many of us, as members of ARMA International, the events surrounding The Sedona Conference and the amendment of the Federal Rules of Civil Procedure were probably the first signs that made us aware of the links between electronic discovery and records management. Ontario and other Canadian jurisdictions faced similar e-discovery challenges as computers proliferated in the workplace.

In 2001, the Attorney General and the Chief Justice of the Superior Court of Justice established a task force on electronic discovery. The task force, chaired by Justice Colin Campbell of the Superior Court of Justice, Toronto Region, was mandated to review existing practices and propose reforms. Its report, published in 2003, included two recommendations that helped broaden the scope of electronic discovery:

  • Amend rules 30.01 and 31.01 to include a definition of "data created and recorded in electronic form";
  • Develop best practices for the preservation of electronic records and for the scope, cost, and methods of electronic discovery.

Discovery was no longer limited to paper records. Attention therefore turned, once again, to the issues of managing electronic records and information.

The work of The Sedona Conference has been instrumental in advancing electronic discovery, data protection, privacy, and information governance. The Sedona Conference's Working Group 1 (WG1), whose members included representatives of the U.S. legal community and members of ARMA International, focused on developing guidelines for the retention and production of electronic records, publishing, in March 2003, The Sedona Principles: Best Practices Recommendations and Principles Addressing Electronic Document Production. These guidelines were meant to give organizations concrete guidance on putting the Sedona principles into practice when preparing for litigation.

Susan Wortzman26, the first Canadian to attend a Sedona Conference meeting, took part in developing Ontario's e-discovery guidelines, which led to the Sedona principles being incorporated into them. Through her involvement in WG1, Ms. Wortzman also came to work with The Sedona Conference to form Working Group 7 (WG7, or Sedona Canada), which developed the Sedona Canada principles. As The Sedona Conference website notes, WG7 was formed in 2006 with a mission to:

"Create forward-looking principles and best practice recommendations for lawyers, courts, businesses, and others who regularly confront e-discovery issues in Canada."

The first edition of The Sedona Canada Principles Addressing Electronic Discovery27 was published in early 2008, in English and in French. It was immediately recognized by federal and provincial courts as an authoritative guide for Canadian practitioners, and Ontario's Rules of Civil Procedure and the practice directions that came into force in January 2010 referred to it explicitly.

In November 2015, the second edition of The Sedona Canada Principles was published, and Working Group 7, which interested Canadian residents may join, continues its work on e-discovery and information governance issues in Canada.

 

The Electronic Discovery Reference Model (EDRM)

Founded in May 2005, EDRM28 set out to address the lack of standards and guidelines in electronic discovery. Its work led to the publication, in 2006, of the Electronic Discovery Reference Model, which defines the stages of the e-discovery process. The first stage of that process was records management. By placing records management at the head of the reference model, its authors underscored the importance of managing records in accordance with current retention schedules and business practices. The underlying premise was that by managing information resources effectively, there would be less data to sift through in full when litigation arises. One aspect of this evolving process is worth noting. While the 2014 version of the model puts information governance in the first box of the e-discovery process, between 2006 and 2016 the contents of that first box changed step by step: records management (2006) gave way to information management (2007) and then to information governance (2014). These successive changes mirror the evolution of the workplace and of how stakeholders perceive records, information, and technology.

The EDRM website describes information governance as:

"Getting your electronic house in order, from the initial creation of electronically stored information through its final disposition, in order to mitigate the risk and expense of e-discovery issues."

Electronic discovery, and the liabilities and risks that flow from poorly managed information, are what changed how organizations view their information assets. So how does risk management fit into this process, and how can we assess our information risks?


Risk Assessment and Business Strategies

To protect their operations adequately, organizations analyze several different kinds of risk: general risk, operational risk, financial risk, and so on. These can usually be defined and quantified to make them easier to assess. Behind those risks lie records and data that document the organization's activities and decisions in detail.

We have all, at some point, applied for a credit card, a personal loan, or a mortgage. Financial institutions examine our credit history carefully before approving or declining the application: they are assessing the financial risk we represent to them. These decisions rest on sets of criteria, scoring models, and various ways of weighting the results, all of which are quantified and form part of the institution's risk management framework. To reach a decision, the institution has us complete a loan application, which is then analyzed and passed along in a report documenting the decision made. That documentation, whatever its format, is the history of the transaction. Just how important are the records of that risk analysis to the organization?
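The decision process just described — criteria, a scoring model, and a method for weighting the results — can be sketched in a few lines. The criteria, weights, and approval threshold below are invented for illustration; a real institution's model would be far richer and calibrated against historical data.

```python
# Weighted-scoring sketch. Criteria names, weights, and the threshold
# are illustrative assumptions only.
WEIGHTS = {"payment_history": 0.5, "income_ratio": 0.3, "credit_age": 0.2}

def risk_score(criteria: dict) -> float:
    """Weighted sum of normalized criterion scores (0.0 worst, 1.0 best)."""
    return sum(WEIGHTS[name] * value for name, value in criteria.items())

def decision(criteria: dict, threshold: float = 0.6) -> str:
    """Approve when the weighted score clears the assumed threshold."""
    return "approve" if risk_score(criteria) >= threshold else "decline"

applicant = {"payment_history": 0.9, "income_ratio": 0.5, "credit_age": 0.4}
print(round(risk_score(applicant), 2))  # 0.68
print(decision(applicant))              # approve
```

Note that the inputs and the outcome together form exactly the transaction history discussed above: keeping them is what allows the decision to be explained, and defended, later.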

With new technologies, risk analysis covers every aspect of implementation: what risk factors could affect the project? What happens if one of them materializes? How can the consequences be minimized and the risks reduced? The records produced during the risk analysis support the decisions made and provide a tracking mechanism throughout the project.

Those of us who have implemented vital records programs as part of business continuity planning have had occasion to assess the value of the organization's records and to weigh the risks they face (for example, natural disaster, physical damage from fire or water, damage from data theft or system sabotage). We have been able to study the incidents that can occur and their frequency, analyze various scenarios, and determine which types of records posed a high risk to the organization if lost, or were considered essential to resuming operations after a disaster.

Given the sheer volume of data created and stored in the workplace, organizations are gradually adopting a risk-based approach to records management. Victoria Lemieux, Ph.D., describes this approach in Managing Risks for Records29, where she presents two risk assessment approaches applicable to records and information management:

  • The first approach is event-based, like the approach typically used to identify risks in vital records programs;
  • The second is based on the requirements specific to records and information.

In the second approach, Dr. Lemieux suggests basing the risk assessment on the value the records hold for the organization's overall strategic direction rather than on events, producing a cross-functional, multidisciplinary approach to information risk management. She suggests integrating records and information risk management into the organization-wide risk management function, touching every aspect of the organization (operations, training, strategic development, budgeting), rather than running it as a separate, standalone activity of the records management program. Her book details both approaches and gives examples of the consequences that can follow from poorly managed records and information risks. The following table lists some of those consequences:

Like the other components of the information governance framework discussed so far, risk management is an integral part of operations, and resources (such as Victoria Lemieux's book) are available to help shift from a traditional view centred on vital records and their risks to one focused on organization-wide risk. Records and information management professionals must understand this concept in order to take part in the discussions of a cross-functional team.

 

The Evolution of Competencies

So how do we determine what we need to know to succeed as records and information management, and information governance, professionals when our environment is constantly changing? When the records management profession began to change in the 1990s, there was a distinctly felt need to articulate what the profession actually did. A review of federal government occupational classification codes shows there was no distinct category for records managers, because the profession itself was not well defined. Lawyers, physicians, and dentists had well-defined sets of skills and competencies; nothing of the kind existed at the time for records and information managers in North America.

 

The AABGD Competency Model

In 1994, ARMA Canada took part in a project launched by Human Resources Development Canada (HRDC). Known as the Alliance des archives, des bibliothèques et de la gestion des documents (AABGD)30, the project brought together representatives of three professional fields to examine the human resources development challenges facing the information resources sector31. The Trousse de travail des spécialistes en gestion de ressources d'information dans les domaines des archives, de la bibliothéconomie et de la gestion documentaire32 describes the AABGD as:

"unique because it brings together three professional fields (librarianship, archives, and records management). It has also begun to demonstrate the benefits of collaboration among the three professions by identifying and addressing their common human resources needs."

The groups met for about five years and produced not only a detailed set of competencies but also toolkits to support their use. The AABGD competency set33 comprised seven professional competencies supported by three general skill sets, and it defined the key activities of information resources management. The committee members representing the three professions agreed that librarians, archivists, and records managers:

  • Design and maintain programs and services;
  • Acquire and dispose of information resources;
  • Develop a framework for access to information resources;
  • Provide reference and research services and advice;
  • Deliver services electronically;

  • Preserve and protect information resources.

These professionals must also possess:

  • Administrative and management skills;
  • Interpersonal skills;
  • Personal skills.

The AABGD competency set was used to recruit, select, train, and evaluate staff, and it supported the development of records and information management training programs. These competencies were arguably ahead of their time because of the collaboration among the three professional groups.


Were we the only ones building competency models? Certainly not. Were we leaders in applying the process? Yes, we were.

 

ARMA International Competency Models34

The first set of competency models developed by ARMA International, in 2007, focused on records and information management competencies. It differed from the AABGD competencies in dividing the competencies into four levels, from entry-level practitioner to senior executive, and into six corresponding domains:

  • Business functions;
  • Records and information management practices;
  • Risk management;
  • Communications and marketing;
  • Information technology;
  • Leadership.

During the transition from records management to information management, the Canadian General Standards Board (CGSB) published CGSB-192.2-2009: Competencies of the Federal Government Information Management Community, whose format differs from both the ARMA and AABGD models. At the time of writing35, a revision of the standard had been proposed but was not undertaken for lack of interest. The standard is still available on the CGSB website.

ARMA International then set out to broaden its offerings to members to include information governance, creating the Information Governance Professional (IGP) certification in 2012. ARMA International describes the role of an information governance professional as follows:

"A certified Information Governance Professional (IGP) designs and oversees programs for governing the organization's information assets. The IGP partners with the business to foster innovation and competitive advantage while ensuring strategic and operational alignment across the organization's objectives (business, legal, compliance, and technology). The IGP oversees a program that supports the organization's profitability, productivity, and efficiency, and that protects it."

To support the IGP designation, ARMA International developed a competency set that complements its original records and information management competencies and sets out the skills the designation holder must possess, namely the ability to:

  • Manage information risk and compliance;
  • Develop an information governance strategic plan;
  • Develop an information governance framework;
  • Establish an information governance program;
  • Operationalize information governance and oversee its integration;
  • Align technology with the information governance framework.

For anyone defining the content of a future information governance program, this competency list offers a preview of the program's activities, listing the domains covered and the knowledge and skills required. The competencies can also help records and information management professionals identify their professional development needs, map out a career path, and find training opportunities.

I personally believe that information governance is a response to a changing work environment, one deeply transformed by technology, that demands collaboration among several groups to keep information management consistent and reduce duplicated effort. The competency sets of records and information management professionals have grown and expanded, as have those of other specialists in law, privacy, risk management, and technology. All must work together to deliver organization-wide solutions, because today's issues extend far beyond the responsibilities of any single department.


Looking Ahead: Roles and Responsibilities

Looking at information governance as a whole, it is clear that information assets are the common element and that an information governance framework is needed to ensure, among other things, compliance. But unlike records management, which has traditionally been the preserve of a designated group, information governance issues are more complex and require input from different groups, input that will vary with the needs and concerns expressed:

  • The privacy officer ensures that personal information is collected, used, and disposed of in accordance with the applicable legislation.
  • The legal department or outside counsel makes sure internal and external clients understand preservation and e-discovery issues. In recent years, the rising cost of the data searches needed to support litigation has made stakeholders more aware of the benefits of effective information governance in the normal course of business.
  • The IT department oversees all systems and the data they produce and hold, keeping them secure and accessible for as long as necessary through active data management and digital retention and preservation methods that account for changes in software and hardware.
  • Records and information management staff provide guidelines, standards, policies, and procedures, and ensure that information is properly managed from creation to final disposition.
  • Employees are now far better informed about the role of information assets and the impact of technology. They expect, however, that the management of those assets will be transparent, so they can get what they need to do their daily work. Any measure that turns information management into a burden on the user will be rejected.

Along the same lines, here is a RACI matrix (Responsible, Accountable, Consulted, Informed) prepared by the Information Governance Initiative from the feedback it received during its survey:

 

A person's responsibilities within an organization depend on the type of organization they work for, as well as on its strategic business objectives, activities and regulatory environment. As my father often said, "if you know where you are going, more than one road will take you there." The same is true of information governance. A program's success, however, also depends on the efforts of its champions and of the cross-functional team responsible for it, and those people must be supported by appropriate tools and technologies.

 

Conclusion

A few years ago, at a National Association of Government Archives and Records Administrators (NAGARA) conference in Sacramento, California, a session presenter suggested that archivists and records managers change their message so that records creators would stop seeing their field as one strewn with pitfalls and jargon, and instead see it as an appealing field they could identify with. We are surely on the right track!

There is no doubt that information governance, whatever name we give it, will come to play an essential role in shaping an organization's strategic position, given the stakes attached to its information assets: privacy, e-discovery, information risk management, compliance and the business value of information.

Records management is not a thing of the past. It has simply changed. Organizations still need to manage their records, because records constitute and substantiate their activities and transactions. For records (or data, information, knowledge, and so on) to be useful to the organization, they must be managed throughout their life cycle. The debate over what information governance is and who is responsible for it will continue as our roles keep evolving. Each organization will design and implement programs based on its strategic direction, its resources and its particular risk management and compliance requirements. The key change to make is to build cross-functional teams that solve problems through a holistic, organization-wide approach rather than a siloed, department-by-department one.

We now have a rich body of work produced in Canada by records management professionals who gradually overcame the challenges posed by new technologies. We have built a solid tradition of leadership in archives, records and information management. We must draw lessons from that tradition and use them to our advantage. We must also watch ongoing research to set our future direction and prepare for the challenges ahead. Our Canadian colleagues will continue to lead the way in different respects and will help us advance our professional knowledge.

As I mentioned at the beginning of this article, this is not an exhaustive study of the subject, and I have surely neglected to cover certain themes or introduce certain people. I apologize for that. Much ground remains to be covered. I therefore invite you to prepare an article for an upcoming ARMA Canada publication. I hope many of you will take up the torch and help, in your turn, to make the future visible!

 

Appendix A: ISO Standards

Information Technology

ISO/IEC 20000-1:2011 Information technology – Service management – Part 1: Service management system requirements
ISO/IEC 27014:2013 Information technology – Security techniques – Governance of information security [available in English only]
ISO/IEC 38500:2015 Information technology – Governance of IT for the organization [available in English only]

Information Management

ISO 30301:2011 Information and documentation – Management systems for records – Requirements
ISO 15489-1:2001 Information and documentation – Records management – Part 1: General [under revision]
ISO 16175-1:2010 Information and documentation – Principles and functional requirements for records in electronic office environments – Part 1: Overview and statement of principles [available in English only]
ISO/TR 17068:2010 Information and documentation – Trusted third party repository for digital records [available in English only]
ISO/TR 18128:2014 Information and documentation – Risk assessment for records processes and systems
ISO 23081-1:2006 Information and documentation – Records management processes – Metadata for records – Part 1: Principles
ISO/TR 26122:2008/Cor 1:2009 Information and documentation – Work process analysis for records [available in English only]

 

Endnotes

The following URLs were verified on February 15, 2016.


1 The Canadian Atlas Online http://www.canadiangeographic.ca/atlas/themes.aspx?id=connecting&sub=connecting_technology_wireless&lang=Fr
2 See Gil Press, Forbes – http://www.forbes.com/sites/gilpress/2013/05/09/a-very-short-history-of-big-data/#3ad2a6f555da
3 http://www.emc.com/collateral/analyst-reports/idc-digital-universe-are-you-ready.pdf

4 http://www.emc.com/leadership/digital-universe/2014iview/index.htm

5 https://www.igt.hscic.gov.uk/Home.aspx?tk=424106365366405&cb=7e8b504b-fbb7-488d-aefd-a8894cfb45b1&lnv=7&clnav=YES
6 Note that Gartner, like many other organizations, uses the term "archiving" to mean moving inactive records to less expensive locations or media. For people working in the field, that usage creates confusion when it comes to identifying the small fraction (5%) of a private or public sector organization's total records that must be kept long term to form its institutional memory.
7 TSC was founded in 1997 by Richard G. Braman. It is a non-profit 501(c)(3) organization devoted to research, education and advanced study of law and policy in the areas of antitrust law, complex litigation and intellectual property rights.
8 Images illustrating the information governance framework

9 www.edrm.net

10 William Benedon served as president of the American Records Management Association and as editor of the Records Management Quarterly. He received the Emmett Leahy Award in 1968 for his outstanding contribution to records management.
11 Prentice-Hall, Inc., 1969
12 A term popularized in the 1980s and 1990s as personal computers spread through the workplace.
13 As in other National Archives programs, where preserving data for historical reference and research, alongside the preservation of historical paper records, was a major concern.
14 "Information: To Share Or Not To Share? The Information Governance Review"

15 http://www.tbs-sct.gc.ca/pubs_pol/dcgpubs/tb_h4/holdings-fonds04-fra.asp

16 Dr. Vale later became head of the Government of Ontario's Information Management Branch.
17 http://www.im.gov.ab.ca/documents/imtopics/IMFrameworkSummary.pdf
http://www.im.gov.ab.ca/documents/imtopics/IMFrameworkReport.pdf
18 http://www.oecd.org/fr/sti/ieconomie/lignesdirectricesregissantlaprotectiondelaviepriveeetlesfluxtransfrontieresdedonneesdecaracterepersonnel.htm
19 Corresponds to the principles set out in the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, published by the Organisation for Economic Co-operation and Development (OECD) and endorsed by Canada in 1984.
20 This includes records containing personal information.
21 The concept was developed by Ann Cavoukian, Ph.D., to build privacy protection "into the very heart of technologies." https://www.ipc.on.ca/french/privacy/introduction-to-pbd/default.aspx
22 https://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-Papers-Summary/?id=329

23 Replaces ISO 17799.
24 Provenance Systems was founded in 1989.
25 Published in 2001 by Open Text Corporation. The company name has since become OpenText (one word); the author used the name in use at the time the book was published.
26 Susan Wortzman is the founder of Wortzmans, a firm based in Toronto, Ontario. http://www.wortzmans.com
27 https://www.google.com/search?sourceid=navclient&aq=&oq=Sedona+Canada+Principles&ie=UTF-8&rlz=1T4AURU_enCA499CA500&q=sedona+canada+principles+addressing+electronic+discovery&gs_l=hp..1.0l2j0i22i30l2.0.0.0.5630047.0.zGJBBlnet3o
28 EDRM brings together 400 organizations concerned with information governance and e-discovery, including 195 software and service providers, 88 corporations, 76 law firms, 24 government bodies, 12 educational institutions and 5 industry groups.
29 Lemieux, Victoria. Managing Risks for Records and Information. Lenexa, KS: ARMA International, 2004.
30 The committee included members of ARMA International, the CCA, the AAQ, the ACA and other associations representing various types of libraries across Canada.
31 The name "documentary resources sector" was chosen to reflect the common elements of the three professional fields (libraries, archives and records management).
32 Published in August 2002 by the Cultural Human Resources Council.
33 http://www.culturalhrc.ca/heritage/f/01-01-00.php
34 http://www.arma.org/r1/professional-development/education/competencies
35 2016-02-02

A CONTENT ANALYSIS OF INFORMATION IMPACT: PROFESSIONALISM OR NOT – A CRITICAL TWENTY-FIVE YEAR REVIEW

By: John Bolton, MA, MLS

Introduction:

During the late 1980s and into the early 1990s, the field known as Records Management, later Records and Information Management or RIM, debated whether RIM was, or could in time become, a "profession" comparable to medicine, engineering or law. The debate surfaced in the literature of the day as some variation of the question "Is RIM a Profession?" That literature defined the term "professional," outlined the fundamental requirements a field of study must meet to qualify as a profession, and tried to locate records management, at that point in history, along the road to true professional status (see Pemberton and Pendergraft, 1990). A reader of that literature could be led to believe that RIM, if not exactly a profession yet, was at least close to reaching the goal. And while no firm timeline was stated, the literature spoke vaguely of years, or a few decades, before the goal would be achieved.

An example of how significant the term "professional" had become by the late 1980s can be seen in the January 1988 issue of the Records Management Quarterly, published by the Association of Records Managers and Administrators (ARMA International or ARMA). That issue contained the "Cumulative Index to the Records Management Quarterly 1967-1987," a listing of all published articles by subject matter. While there was no specific subject heading for "Professionalism," the index referenced thirty articles under "Professional Organizations" and forty-one under the "see also" term "Information Professions." At first glance these numbers may not seem significant; for comparison, however, the core records management term "Disposition" referenced only nine articles, and "Vital Records Management" only twenty-four.

This example is not offered as overwhelming proof that the records management world of the time was deeply embroiled in an identity struggle, but simply as a mechanism to point at where some of the "thinking" was during that period. Pemberton (1993) said that a discipline was a field of study, while a profession was something elevated by society. Given that more than twenty-five years have passed since the late 1980s, the question is: did it happen? Did RIM move forward? Did records management reach the lofty goal of a true "profession," one equal in public recognition to engineering or medicine? This paper attempts to address these questions through an in-depth examination of information content published by ARMA.

Background:

During the late 1980s the author joined ARMA International and became interested in whether RIM could reach the goal of a true profession. At the time, the author worked for a civil engineering company as its Librarian/Records Manager. While he was in that position, the company bid on a large and complex engineering opportunity that required adherence to the ISO 2000 Quality Control standards, which were very specific about documentation and its capture, control and maintenance. In bidding for the work, the company applied for insurance and listed the author's name and qualifications (a Master's degree in Library and Information Studies, plus partial completion of the CRM designation requirements), alongside various kinds of engineers, as the professionals who would carry out the project duties. While all the engineers and the company accountant were accepted under the underwriter's terms, it was to the author's embarrassment and chagrin that he was refused coverage because his education and CRM credentials did not qualify him as what the insurance people considered a "professional." This was not why the company failed to win the project, but it did cause the company to view its Librarian/Records Manager as "office" staff rather than as one of the organization's "professionals."

This bit of personal history helps set the stage. The author believes that over his career he was a professional: that he conducted himself as a professional and produced professional-level work. Having achieved the CRM designation and had some of his work published, the author, like many of his RIM colleagues, had a deep interest in seeing his chosen field attain the lofty goal of a true profession. The experience became a personal driver to try to help the RIM profession reach that goal, and it forms part of the background to why this work was undertaken.

Beyond that personal story, there were four other drivers behind this paper, or, said differently, four questions the author felt were unanswered and should not be. They are as follows.

Question 1: Why would people working in RIM feel that the field should have a peer-reviewed journal for publishing scholarly works, yet feel they would gain no particular benefit from the availability of such a journal?

This question has its origin in an ARMA International Education Foundation (AIEF) research survey (see Force and Shaffer, 2013). That survey asked ARMA members whether RIM should have a peer-reviewed journal. While the survey found support for the idea that such a journal should exist, it also found the disturbing news that respondents felt they would personally gain nothing from its existence. These two findings sit at opposite ends of the spectrum and raise further questions about why. This work is an attempt, to a degree, to address the AIEF findings.

Question 2: Why did a RIM audience seem shocked at the use of content analysis as an approach to analyzing published RIM materials? Or, stated differently, had the RIM audience ever been exposed to this level of professional media analysis, and if not, why?

In April 2014, the author gave a presentation on the topic of the "Information Landscape" to a RIM audience (the ARMA Vancouver Chapter) in Vancouver, British Columbia. The presentation included a graphic that the audience found interesting and, to a certain extent, shocking. The graphic identified some characteristics of shifting content types in ARMA's RIM publications. While the specific graphic does not bear on this work, the approach used in developing it does: "content analysis," a mechanism used in analyzing media. Since the Vancouver audience seemed taken aback by a graphic produced via content analysis, the author was left wondering why.

Question 3: What has happened to RIM? Or, what was/is the status of RIM in its march towards reaching the goal of a profession?

The third question concerns time and status. As stated above, during the 1980s and 1990s RIM debated whether the goal of "profession" was or could be achieved. Given that a quarter of a century had passed, one was left to wonder what happened. If RIM achieved the goal, had there been some sort of announcement? And if we hadn't made it, why not? The author believed he had been paying attention, and no announcement came to mind. Thus, while the word "professional" was certainly in use in the RIM industry and literature, the author wondered whether the community had been under some sort of false impression. That idea was prickly to accept, so the outstanding questions needed to be answered.

Question 4: Given what the author understood and had experienced over twenty-five years, he was unsure about RIM's status as a profession; in fact, he was of the impression that the boat not only hadn't sailed, it had never really left the dock. Was the author right or wrong in this impression?

Also as background to this work, an explanation is needed of what a profession is and how a field attains that status. According to Greenwood (1966), certain elements distinguish professionalization: the development of, and research into, systematic theory; a sense of autonomy (only a doctor is trained to perform surgery); community sanction (doctors are recognized everywhere as highly educated and dedicated to their Hippocratic Oath to heal the sick); a code of ethics that members adhere to and that some professional body oversees, monitors and enforces with penalties where breaches are found; and, finally, a culture, that is, a common language, an approach to how the field works, and an attitude toward how it operates and presents itself. Millerson (1964) described a profession as a non-manual occupation with recognized occupational status and a well-defined area of study or concern, one that provides a definite service and that individuals enter after advanced training and education. For this work, and for RIM people in general, advanced training and education are the key factors. Here is why.

Greenwood (1966) also suggested that "because understanding of theory is so important to professional skill, preparation for a profession must be an intellectual as well as practical experience," and that "orientation in theory can be achieved best through formal education in an academic setting." From this we understand that education to a higher degree level is essentially a prerequisite for becoming a professional. But where is this higher-degree-level education in our RIM world? Frankly, it basically does not exist. Undergraduate and graduate (Master's and, in some cases, doctoral) degrees do exist, but they are not in RIM. Rather, those degrees are in Library Science (sometimes called Library and Information Science), Archival Studies, Information Technology or Computer Science.

Berenika Webster (1999) wrote in ARMA's Information Management Journal that "Records Management is undergoing the process of professionalization by acquiring some of the (needed) characteristics, i.e. formal education to a degree level, existence and strengthening of professional organizations, foundation and development of professional literature, increased research activity supplying the discipline with new theoretical frameworks, and new knowledge to deal with issues of technological development." Webster's words made for positive reading and could have led a reader at the time to believe that movement toward professionalization was under way and the goal was within reach. The examples supporting her argument, however, were actually weak. The formal advanced degree programs she mentioned did not really exist beyond a couple of remote examples (i.e. the UK and Australia). And while she pointed to the existence of ARMA and other RIM-oriented organizations, those bodies were not in fact "strengthening": ARMA's membership has not really increased over the last twenty-five years, and the Records Management Association of Australia (RMAA) joined with a smaller Southeast Asian group to remain operational. There is no question that the body of RIM literature has grown, but where, the AIEF and its work notwithstanding, is the research activity supplying the discipline with its own distinct theoretical frameworks? And while new knowledge has certainly arrived to deal with technological development, RIM was likely not its primary supplier. Where, then, was the advancement toward professionalization that Webster spoke of?

When Webster wrote her 1999 article she credited Dr. J. Michael Pemberton (with Lee O. Pendergraft, 1990) for his early work published by ARMA describing what it takes to be a profession and what RIM would need to accomplish to reach that goal. But in 1990, some nine years before Webster's article was published, Pemberton spoke at the ARMA conference in San Francisco, where he "criticized RIM practitioners' dislike of matters theoretical, and claimed that without theoretical foundations, there could be no meaningful research effort, and without research we have only hearsay, conjecture, anecdote, and possibly propaganda."

These questions, along with the author's personal experiences and interests, formed the background to the research outlined in this work. The findings and conclusions were first reported in a PowerPoint presentation by the author at the ARMA Canada conference held in Calgary, Alberta, on Tuesday, May 26, 2015, Session T-24 (see Bolton, 2015). That presentation prompted some ARMA Canada members to take an interest in action that would bring about change, specifically in improving the level of Canadian authorship and content through some mechanism. The mechanism subsequently formulated was a website, to be hosted by ARMA Canada, providing a place where future scholarly RIM-oriented materials could be deposited, cited and referenced by future users.

Methodology:

The methodology applied in this research had four components: a background search of the relevant RIM literature; the gathering of data to support an investigation of the outstanding questions; analysis of the gathered data; and the identification and formulation of observations, findings and conclusions resulting from that analysis. As mentioned above, those observations, findings and conclusions were originally reported in a conference PowerPoint presentation that eventually led to this paper (see Bolton, 2015).

Regarding the literature search, it is important to acknowledge that the review was not exhaustive. For the purposes of the original 2015 Calgary conference presentation, the author conducted only enough of a search to provide evidence and credence for the theme of the presentation; an exhaustive search was unwarranted, since the work was not undertaken to be exhaustive or scholarly in the sense of postgraduate effort. Nevertheless, the author did search the Internet for any posted work, and searched the ARMA publications for anything of a relevant or similar nature. Not much was found, which to a certain extent addresses the question of why a Vancouver RIM audience seemed shocked by the approach presented: the literature search suggests they were shocked simply because they were unfamiliar with content analysis being applied in their RIM world. It was certainly not because the Vancouver audience was uneducated or unsophisticated; the technique just appeared to be new to them, although their eyes did seem to get bigger as the findings were discussed.

(NOTE: During the literature search the author did discover a research paper by Nelson Edewor [2013], which was a content analysis of a Nigerian Library and Information Science Journal. An interesting and somewhat similar type of research work, although of a reduced level of complexity.)

As for the data gathering and analysis performed, the author relied on "content analysis," a form of bibliometrics. According to Krippendorff (1989), content analysis is "a research technique for making replicable and valid inferences from data to their content," while Elo and Kyngas (2008) considered it "an approach to distil words and communications into fewer content related categories or units." The basic concept is that the content of a larger work, or body of works, can be understood by first identifying its make-up or components and then recording details about those components. In a library setting this is somewhat like separating works by subject matter, sciences from history or fiction from reference materials, so that the librarian can conduct counts, recognize storage requirements, understand usage from circulation data, and possibly spot gaps in a particular collection. In this case, content analysis is used to dig deeper into the nature of ARMA's publication offerings by examining their published content.
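The distillation Elo and Kyngas describe can be pictured as a small counting routine. The sketch below is purely illustrative: the category names and keyword lists are assumptions made for demonstration, not the coding scheme actually used in this study, and in real content analysis a human coder, not a keyword match, assigns the categories.

```python
from collections import Counter

# Illustrative category codebook (an assumption, not ARMA's actual scheme).
CATEGORY_KEYWORDS = {
    "professionalism": ["profession", "professional", "certification"],
    "disposition": ["disposition", "retention", "destruction"],
    "technology": ["software", "electronic", "imaging"],
}

def categorize(title: str) -> list[str]:
    """Return every category whose keywords appear in an article title."""
    lowered = title.lower()
    return [cat for cat, words in CATEGORY_KEYWORDS.items()
            if any(w in lowered for w in words)]

def tally(titles: list[str]) -> Counter:
    """Count how often each content category occurs across a set of titles."""
    counts = Counter()
    for title in titles:
        counts.update(categorize(title))
    return counts

titles = [
    "Is Records Management a Profession?",
    "Retention Schedules and Disposition in Practice",
    "Electronic Imaging Software for the Records Office",
]
print(tally(titles))  # one hit per category for this sample
```

The point is only that content analysis reduces many texts to a countable set of categories, from which trends over time can then be charted.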
The information content that was reviewed, captured and later analyzed came from the three RIM-oriented publications that basic paid membership in ARMA provides: the Records Management Quarterly (1988-1998), the Information Management Journal (1999-2008) and Information Management Magazine (2009-2012). While ARMA has published other RIM-oriented materials, including standards, books, guidelines, notes and seminars, these three publications are, the author would submit, recognized as ARMA's main RIM offerings to its membership, and they have certainly been its most regular.

Further, a twenty-five-year span of these publications was selected as a sample set, in the hope that it would be large enough to offer a glimpse into the content published. A smaller sample was rejected as insufficient to reveal any trends that might have taken place over the years. A span longer than twenty-five years might have revealed such trends even better; however, as explained later, editorial changes, i.e., the shift to electronic issues and the decision to stop printing citations/references, imposed extra burdens that hampered exploring beyond the twenty-five-year mark. Thus, the sample set started in 1988 and ended with the full 2012 publication volume issues.

As stated above, the questions under consideration concern the professionalization of RIM. To explore this through the content of the ARMA publications, it might have been possible to count every appearance of the word "professional," and of its derivations, in those publications. Without a computer and electronic versions of all the publications, however, that approach would have been very onerous.
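With electronic full text, the exhaustive count described above becomes trivial. A minimal sketch, assuming plain-text issues were available (which, for most of this sample, they were not):

```python
import re

def count_word_family(text: str, stem: str) -> int:
    """Count occurrences of a word stem and its derivations,
    e.g. 'profession', 'professional', 'professionalism'."""
    return len(re.findall(rf"\b{stem}\w*", text, flags=re.IGNORECASE))

# Invented sample text, for illustration only.
sample = ("Is RIM a profession? Professionals debated professionalism "
          "while records managers counted vital records.")
print(count_word_family(sample, "profession"))  # → 3
print(count_word_family(sample, "record"))      # → 2
```

Ranking such counts against other RIM terms across the full twenty-five-year run is exactly the doctoral-thesis-scale exercise the paper sets aside.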

Even if all the occurrences of the word "professional" had been counted, they could only have been ranked against other RIM words if those had also been counted. Such an approach might have revealed information through rankings and comparisons, and might have been very interesting; it could interest a graduate student working on a doctoral thesis, but for this work it was much too involved. Nevertheless, some approach was required. Since the idea of this work was to explore whether the goal of professionalism had arrived in the field of RIM, it was considered that answers might be found in three other areas of publication content: characteristics of the authors, the RIM subject content, and the advertising content. To help gather the content data, a form (see Fig. 1) was created.
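The form's three areas of interest could be captured in a simple record structure, one row per article examined. The field names below are hypothetical, since Fig. 1 itself is not reproduced in this text:

```python
from dataclasses import dataclass, field

# A sketch of the kind of record the data-gathering form (Fig. 1) might
# capture for each article. Field names are assumptions, not the form's
# actual layout.
@dataclass
class ArticleRecord:
    year: int
    author: str
    credentials: list[str] = field(default_factory=list)  # e.g. ["CRM", "MLS"]
    employer_type: str = ""   # vendor, corporation, government, academia...
    country: str = ""
    subjects: list[str] = field(default_factory=list)     # RIM subject content
    ad_pages: int = 0         # advertising content in the same issue

rec = ArticleRecord(1990, "J. M. Pemberton", ["PhD"], "academia", "USA",
                    ["professionalism"], 12)
print(rec.subjects)  # → ['professionalism']
```

Structuring the data this way makes the three analyses that follow (authors, subjects, advertising) simple aggregations over one table.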

The first area of interest was gaining some understanding of the authors ARMA was publishing. Who were they? Where did they come from? What credentials did they hold? What work did they do, and who employed them? And how often were they published? It was felt that an understanding of the authors might reveal trends in the spread of RIM knowledge, the advancement of RIM education, the international nature of ARMA's membership and influence, and perhaps some indicators of professionalization.

The next area of consideration concerned what those authors were writing about that ARMA felt was worthy of publication. In exploring this content, data was gathered on specific RIM subject areas of interest. In other words, what matters of RIM operations, procedures, policies, technology, management and scholarly interest did ARMA feel were important and of value to its membership? Here, without any analysis, it was felt the content should show a generally wide offering of subject matter, with some short-term trends possibly due to issues and/or changes significant at a particular time in history. Moreover, it was also hoped that the subject content would indicate some movement or direction pointing toward professionalization. The idea wasn't whether the content had been prepared by the authors in a professional manner, or whether it appeared professionally published, i.e., looked polished. The idea was whether the content gave any indication that could be seen to clearly state, "this is work produced and published by a true profession," something very different from the tabloid offerings found at the supermarket checkout counter.

The last of the three areas concerned what else ARMA was publishing for its readers: advertising. Here the desire was to collect information revealing not only what products and services were being advertised, but by whom, and what trends or inferences, if any, might be drawn from that information. From the outset it was considered that products and services offered by various vendors and used by people working in the RIM field should stand out above any other content. This seemed reasonable because such products and services, i.e., filing/labeling equipment, storage equipment (boxes and shelving), technology hardware and software, shredding, off-site storage, disaster recovery and consulting services, were everyday factors in the life of anyone working in RIM. Thus, it was felt these things should show real significance in the data. It was also thought that some evidence might appear through them in support of a movement towards professionalization, although it wasn't clear if or how this evidence might appear.

Ultimately, as the methodology for this work evolved, a number of rules for data gathering were clarified. These were:

  • Authors included ARMA Editors.
  • Articles included: Editor’s comments and notices, letters to the Editor, research, case studies, reviews of books and other media, notices, awards, postings, RIM related comments, opinions and/or discussions.
  • Publication information such as publisher name and address, subscription fees, instructions to authors and advertising are not articles.
  • Advertising for laws/statutes/regulations was categorized as LEGAL.
  • Standards also included guidelines, technical papers and best practices.
  • Only citations, not bibliography references, were counted.
  • If an author cited a work more than once in the same paper, it was only counted as one citation source.
  • Notifications of events, training, conferences and/or seminars were counted as advertising.
  • Advertising that promoted ARMA was categorized as EVENTS.
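The citation-counting rule above (a work cited several times in one paper counts once) can be sketched in a few lines. This is a minimal illustration with invented sample data; the function name and data shape are assumptions, not part of the original study.

```python
# Tally citation sources across papers, counting a work at most once
# per paper, per the rule stated in the text.
from collections import Counter

def count_citations(articles):
    """articles maps an article title to the list of works it cites
    (possibly with repeats); returns one count per unique source per paper."""
    totals = Counter()
    for cited_works in articles.values():
        for source in set(cited_works):  # de-duplicate within one paper
            totals[source] += 1
    return totals

# Hypothetical example: Penn is cited twice in Paper A but counts once
# there; across both papers he is counted twice.
sample = {
    "Paper A": ["Penn 1993", "Penn 1993", "Webster 1999"],
    "Paper B": ["Penn 1993"],
}
print(count_citations(sample))
```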

Analysis:

Authors by Gender: The RIM world is, by pure observation at any conference, composed mostly of women. The author has no hard figures to support this observation with a percentage; however, again by pure observation over twenty-five years working in the field, the estimate would be at least a 75/25 split between females and males participating in RIM. The female side might even be higher than 75 percent. Given this observation, it is revealing that the content data found 63 percent of authors to be male against 37 percent female (see Fig. 2). While striking, this finding is almost a complete reversal of the percentage of actual RIM participants. The finding doesn't necessarily pose any concern, but it does seem interesting. One can only surmise that the males of the RIM industry are possibly more vocal in expressing their views.

Historically, RIM has been a business of women file clerks and assistants. Managers were typically, or at least often, male. Some of those males were prolific authors who worked very hard at describing the world of RIM and teaching people about it. A few, such as Ira A. Penn, were strong voices in our business who spoke often and with conviction. Their words and writings helped foster interest in RIM and build the body of literature. By the early 2000s, history had changed somewhat: the field had many more female managers, teachers, consultants and experts. Given this, it might have been expected that the data would show a trend of growing strength in female authorship through the years of the survey. But no such trend appeared. The data shows an improvement in female authorship from about 2002, but no definitive shift between the numbers of males and females. So, does this reveal that RIM gains its insight mostly from males?

It would appear from the content data that some "balance of power" between the males and females developed within the RIM world. This may seem a grandiose statement; nevertheless, the numbers over the twenty-five-year span are quite consistent, showing a balance of a kind. The females of the industry may not be happy with this finding, but there it is. Maybe real gender change in the RIM field has yet to take place? Maybe it will only take place at some later date? Maybe it won't change at all? Only time, and/or effort expended by the female participants in RIM, will tell. As for the males, considering they form a small segment of the population, they seem to be doing quite well.

Authors by Country: It is not surprising that the content data shows 88 percent (see Fig. 3) of the authors came from the United States of America (USA). ARMA does have members from around the world, but its main membership has always been from the USA. Canada has been considered a "region" of the larger ARMA organization, and "ARMA Canada" has acted to a certain degree as an autonomous group within ARMA, running its own Canadian national conference on an annual basis. Canadian membership in ARMA has been recognized as approximately one tenth of the whole. (NOTE: A membership figure of 1650 was provided by ARMA Canada, September 15, 2015.) So, if ARMA's total membership was about ten thousand, and Canadians made up a tenth, or one thousand, of those members, we could reasonably expect Canadian authorship to fall at or near the ten percent mark. The data, however, showed this was not the case: Canadian authorship was barely 6 percent, only slightly beating out the United Kingdom/Europe at 4 percent. (At a Canadian membership level of 1650, the expected mark should be about 16%, not the dismal 6%. Regardless, as the numbers show, the overwhelming voice of America has been, and continues to be, staggering.)
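The expected-versus-observed comparison above is simple proportion arithmetic, sketched below using the figures quoted in the text (the function name is invented for illustration).

```python
# Expected authorship share if authorship were proportional to membership.
def expected_share(members, total_members):
    """Return a group's membership share as a percentage."""
    return 100 * members / total_members

# Figures from the text: 1650 Canadian members of ~10,000 total,
# against an observed Canadian authorship of 6 percent.
canadian_expected = expected_share(1650, 10_000)
canadian_observed = 6.0
print(f"expected {canadian_expected:.1f}%, observed {canadian_observed:.1f}%")
# prints: expected 16.5%, observed 6.0%
```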

In Canada, while the bulk of the population lives fairly close to the border with the USA, there is a general belief that life is different, and that the way things are done is different. Given this general feeling, it might have been thought the percentage of Canadian authorship would be somewhat higher, with Canadian authors outlining a "Canadian approach" or discussing a "Canadian issue" of interest and benefit to the Canadian ARMA members. Given that the data shows such low participation by Canadian authors, possibly no such Canadian approach or issues were present. Or was it that Canadians wrote nothing worthy of publication by ARMA, or were simply too busy to offer their advice, opinions and efforts? Again, some strong words, but if Canadians do not want to be dominated by their USA cousins, then it would appear they need to be doing something more.

CRMs and PhDs (Professionalism Factor): In 1993, Ira Penn wrote that, “Less than five percent of all the practitioners in the field have chosen to become CRMs. Of those who are CRMs, a considerable number have complained about having to keep up-to-date to remain certified. Although the CRM program has been in effect for some 17 years, a significant percentage of those who have purported to be leaders of the profession during that time have neither pursued the credential nor encouraged others to seek it.” One might imagine that such a statement coming from one of the unquestioned leaders in the RIM field would surely have raised alarms, and caused a stir, especially if RIM was experiencing a movement towards professionalization.

Thus, one of the interests of this work was whether the presence of advanced education had increased over the twenty-five-year period, as Webster's work suggested it should. There was also an interest in finding proof that RIM had moved toward professionalization through its membership gaining that advanced education in the form of doctoral degrees and at least the Certified Records Manager (CRM) designation. At the outset it was felt that if RIM had moved along the road of professionalization, that movement might be verified through the credentials claimed by the published authors; the data should show some upward trend in these two credentials. Such a trend, if present, could have been evidence that more and more RIM workers had made the effort to gain those credentials, whether from a desire for higher education, a desire to be professional, as a mechanism to gain recognition, or because those credentials were required to obtain employment. Unfortunately, no movement was evident in the data. In fact (see Fig. 4), only 18 percent of authors held doctoral degrees, and only 32 percent held the CRM designation.
It should be noted here that of the 1257 recorded authors, many were repeats. So, while any particular survey year might show four authors with PhDs, they could all have been the same author (Pemberton, for example). Each published appearance of a particular person's name as an author was recorded for this work, and those recordings totaled 1257. The percentages of 18% PhDs and 32% CRMs are derived from that total. Had this work eliminated all duplication amongst the authors, the final percentages would be much lower. (For example, the adjusted figure for CRMs would be approximately 10%.)
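The repeat-author effect described above can be sketched as follows. The names and credential flags are invented for illustration; only the idea (rates computed over appearances versus over unique names) follows the text.

```python
# Compare a credential rate computed over all published appearances
# with the same rate computed over unique author names.
def credential_rates(appearances):
    """appearances: list of (author_name, has_crm) tuples, one per
    published appearance. Returns (rate_over_appearances, rate_over_unique)
    as percentages."""
    total = len(appearances)
    crm_appearances = sum(1 for _, has_crm in appearances if has_crm)
    unique = {}
    for name, has_crm in appearances:
        unique[name] = has_crm
    crm_unique = sum(1 for has_crm in unique.values() if has_crm)
    return 100 * crm_appearances / total, 100 * crm_unique / len(unique)

# Tiny hypothetical data set: one credentialed author published three
# times inflates the appearance-based rate well above the unique rate.
sample = [("Pemberton", True), ("Pemberton", True), ("Pemberton", True),
          ("Smith", False), ("Jones", False)]
by_appearance, by_unique = credential_rates(sample)
print(by_appearance, by_unique)  # 60.0 vs roughly 33.3
```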

With such findings, no movement toward professionalization could be found in the data; Webster's implications for the future were not borne out. For the most part, the level of authors holding CRMs and/or PhDs over the twenty-five-year span was steady. What can be determined from this is unknown without further research. However, one might consider that the wider body of RIM workers did not see the need for such advanced education. It would also seem the credentials did not become a requirement for employment, nor for publication, at least in the ARMA offerings. Further, the low level of PhDs suggests there was also a low level of research work being done leading to the development of new theoretical advancements.

(NOTE: To truly understand why advanced degree programs in RIM are basically unavailable, one must read the literature of the 1980s and early 1990s coming from the Library and Archival schools. Fundamentally, the educators at those graduate schools argued that records management work was just an extension of archival work: that if a student received a foundation in RIM knowledge via a single general introductory course, they could gain all the other skills and knowledge of records and information management through a graduate program in either Library Science or Archival Studies.

Thus, while this may have been protectionism of the existing programs, such thinking pretty much killed the development of full advanced-degree curricula in RIM. For further reading, see for example Eugenia K. Brumm (1992) and Tyler O. Walters (1995).)

Authors by Type of Work: When an author's work is published, a small biography about that author is typically published with it. In approximately fifty words, those biographies usually list the author's credentials, their RIM background, positions held within ARMA, years of ARMA membership, and a self-identification of their daily work. At the outset it was felt that collecting this employment information, of a non-privacy-impacting nature, should offer a glimpse at which sectors of the RIM world were most active in authorship. It was not surprising that the data found 31 percent of authors came from the Consulting sector of RIM (see Fig. 5). Consultants have long been an active force in the RIM world, and the numbers support the fact that those consultants have much to say from their years of effort within the industry. It was also good to see that 26 percent of the authors claimed employment as everyday RIM managers, administrators, technicians and analysts, although that number was thought to be somewhat low.

While Teachers (university/college professors/lecturers) made up 12 percent of the authors, the surprise came in the size of the authorship from the employment group identified as Editors and Lawyers. That group made up a shocking 24 percent of the authorship. (NOTE: At the outset of this work, data concerning authors who were lawyers was counted separately. However, while John Montana, a lawyer, was a frequent author, few others self-identified from that profession. So lawyers were grouped with Editors, and as such slightly influence this particular sector of the authors.)

When the content analysis began it was recognized that Ira Penn, Editor of the Records Management Quarterly (RMQ), offered an editorial comment at the front of every volume issue. At first these comments were not counted as "articles." However, after reviewing several volumes it became apparent that Penn, who at the time was employed in the RIM sector, was also employed by ARMA as Editor of the RMQ (even if that employment was voluntary). While speaking as the ARMA Editor, he was acting on behalf of the larger organization and addressing the RIM community on a different level (or playing field) than he would have been purely as a RIM sector worker. Certainly people would have sat up and listened to Penn no matter who his employer was, since he was a strong, well-known voice and expert in the records management field. Nevertheless, once it was recognized that Penn had two employers, and since he was often published by ARMA on his own merits, his editorial comments had to be considered separate articles and counted along with the others. In making this recognition, a startling observation emerged.

After Ira Penn, J. Michael Pemberton took on the mantle of ARMA Editor in October 1998. At that time Pemberton stated that the ARMA publication, i.e., at the time the RMQ, was "a member benefit… and the field's primary professional journal." He went on to explain that while a "magazine format permits a focus on new information, a journal's chief concern is knowledge transfer." In the next year, Pemberton and ARMA changed the name of the RMQ to the Information Management Journal (IMJ). Pemberton wanted ARMA, and RIM, to have a professional journal where analytical and theoretical research could be published, as one mechanism to help move RIM towards professionalization. This idea did not last, however, because the "journal" name survived only ten years before ARMA once again made an editorial change and switched to Information Management (IM) magazine.

Actually, while the IMJ name lasted ten years, it was really only a couple of years before ARMA shifted its editorial focus. This happened in 2002, when it began publishing a new section called "Up Front" in each volume issue. The purpose of the new section was "awareness." Clearly, in Pemberton's terms, this was a step away from "knowledge transfer" and towards a "focus on new information."

In the RMQ years, each of Penn's editorial comments might touch on a single key issue of the day or contain several RIM-oriented comments. Penn's editorial comments, usually covering a single printed page, were each counted as a single article. In contrast, the Up Front sections contained many very different items of news for RIM readers. The Up Front section replaced the editorial comment section, and many of the news and/or interest items published in it were captured from the Internet, from vendors, or, for example, from government web sites offering information about new laws, regulations, legal matters under consideration by the courts, and changing RIM retention schedules. Any of these items might cover one or more printed pages, but often two or three different items appeared on one page. Regardless of their printed arrangement or size, just as Penn's single comment counted as one article, each different item here had to be counted as a separate article offered under the name of the ARMA Editor. Because of this, when the Up Front section began in 2002, a dramatic shift took place (see Fig. 5): the number of RIM sector authors dropped, and the number of offerings from the ARMA Editors rose significantly. It should be noted that these ARMA Editors came not from a RIM background, practical experience or advanced education, but from the world of publishing and writing. The 2002 change was also significant in that rather than one page of Penn's comments, there were now as many as ten or more pages given to this news-type content.

Other findings from the employer data revealed that 1 percent of the authors stated they worked as Vendors, 4 percent as Information Technology people, and 2 percent as Archivists. These findings seemed quite reasonable and did not appear to show any trend up or down; nothing more could be discerned from them than what they were.

Employment by Employer Type: As in the section above, biography data was collected concerning who employed the authors (see Fig. 6). Beyond understanding what work the authors claimed they did, it was felt that understanding who employed them might be interesting, and that such data might reveal one or more sectors of RIM that were drivers in the industry. At the outset the feeling was that the RIM, and possibly the Consulting, sectors should show the greatest involvement. What the data showed provided some validation of this feeling, plus something more.

Figure 6 shows the RIM sector accounted for 26 percent of the activity (12 percent from the Public sector and 14 percent from the Private sector). While this 26 percent was thought a bit low, when combined with the 32 percent from the Consulting sector, a total of 58 percent of authors actively working in RIM was felt to be good. It was also felt that 16 percent of authors being employed by universities/colleges, along with fairly steady activity over the twenty-five-year period, showed at least a consistent interest in RIM and that RIM education was not waning; although the last seven years of the data did show a slight downward trend in that interest.

The most interesting finding can be seen in the data concerning ARMA as an employer. Prior to 2002, as discussed above, ARMA had but a single Editor. From 2002 forward that number rose, as the data clearly shows (see Fig. 6). This finding is a reflection of the finding in the previous section concerning the type of work authors stated they did.

Authors by Rank: It was mentioned above that some authors were published more than once. As the data gathering progressed, and several names appeared repeatedly, it was decided that the number of occurrences of each author would be recorded. When these occurrences were tabulated, a ranking of the most prolific published authors was developed (see Fig. 7). The ranking shows the top twelve published authors, each having seven or more articles published over the twenty-five-year period. While several authors had more than one article published, up to as many as five, a gap appeared between five and seven, so seven articles became the cutoff point for this ranking, which revealed a dozen different authors. Of that dozen, there were no Canadians. All were from the USA, except one author from the United Kingdom, Ann Morddel of London, England. (NOTE: Several Canadians, including this author, have had more than one article published by ARMA. For example, Carolyn Minton from Vancouver had several reviews published, and Monique Attinger from Toronto had five articles published. However, no Canadian within the reviewed time span made the leap into the top-ranked authors.)
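The ranking step described above is a simple tally with a cutoff, sketched below. The names and counts are hypothetical; only the cutoff of seven articles follows the text.

```python
# Tally articles per author name and keep only authors at or above
# the cutoff, sorted most-prolific first.
from collections import Counter

def top_authors(article_authors, cutoff=7):
    """article_authors: one author name per published article."""
    counts = Counter(article_authors)
    return [(name, n) for name, n in counts.most_common() if n >= cutoff]

# Hypothetical tallies: Minton's two articles fall below the cutoff.
sample = ["Pemberton"] * 9 + ["Penn"] * 8 + ["Gable"] * 7 + ["Minton"] * 2
print(top_authors(sample))
# [('Pemberton', 9), ('Penn', 8), ('Gable', 7)]
```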

The highest-ranking author was Dr. J. Michael Pemberton with 59 articles to his credit. Ira Penn and John Phillips came in a close second with 51 articles each, down to Julie Gable in the twelfth position with 7 articles. All of the names on this ranking are well known and highly respected in the RIM industry, all except one: Nicki Swartz. Nicki Swartz was/is an ARMA-employed editor, and not an actual expert in the RIM field, although twelve RIM-oriented articles she wrote were published.

When Nicki Swartz's name appeared on the list, it raised a question: what would happen to the rankings if all of the items from the 2002 editorial change, the Up Front postings, were added up under her name? As the data concerning this question was reviewed, a second name appeared, that of Amy Lanter (also an ARMA-employed Editor and a freelance writer). So, after recalculating the data and counting all the items published by these two persons, including those from the Up Front sections, an adjusted ranking of the authors was produced (see Fig. 8). In this adjusted situation, while Pemberton manages to remain in the top spot, Nicki Swartz bumps Ira Penn and John Phillips from the number two position, and Julie Gable is bumped off the list, replaced by Amy Lanter as the tenth-ranked author.

The rankings of Nicki Swartz and Amy Lanter came as a result of their editorial and writing work, not their years of experience doing day-to-day, real-world records management. This finding was seen as shocking, especially when linked with the fact that the publication issues had shrunk by almost half, from an average of 73.4 pages per issue in the RMQ years to an average of 48.5 pages per issue in the IM magazine years. These writers and their offerings were taking up a dramatic portion of each publication. Was this a sign that workers in RIM were losing interest in publishing and therefore not submitting articles? Or was something else happening?

Article Content: Content analysis is an approach for breaking down content into smaller buckets or areas. In the case of the published articles, five such areas were considered reasonably fundamental or characteristic of RIM: Information Technology, RIM Programs, Professional Development, Legal and Standards. Each is addressed separately below, but overall the data showed that 84 percent of the articles contained information of a professional development nature (see Fig. 9). In and of itself, this finding might give some support to Pemberton's desire that ARMA be educating its membership.
For an article to be counted under the Information Technology (IT) area, its content must have been about the installation, configuration, or customization of software or hardware for implementation within the RIM industry. It needed to be technical. An article comparing software functionality for purchase selection, or about the best use of a piece of software for certain situations, organizations, or records/information, was considered to be geared towards educating readers and therefore of a professional development nature. For an article to be IT, it truly had to be about the computer science of that technology. While many articles had titles or short introductions suggesting IT content, closer examination of the article content revealed that only 4 percent of all the articles fell into the IT area. At first glance this finding appeared low; however, since the focus of the ARMA publication has been RIM and not computer science, the finding is felt to be not unreasonable.

Article content of a RIM Program nature had to be about actually operating some component of RIM in a day-to-day sense. Such an article might speak about operating an offsite storage facility: not the establishment of that facility, but its operation. An article concerning the establishment of such a site would fall under educating readers and therefore Professional Development. The line between the two may seem thin; however, the idea was to understand the underlying purpose of any article and its real focus, regardless of what its title and short intro blurb might suggest. In this RIM Program area, 5 percent of the articles were counted.

Some 6 percent of the articles were considered to be within the Legal area. Here, an article must have been specifically about a new law or regulation, not about the implications of that law in the RIM world. Again, the line is thin; however, where an author wrote about how best to apply a law, or how to gain RIM program recognition because of a particular law, those articles were educating readers and fell under Professional Development. So, in reality, a finding of 6 percent might actually be high, given the focus of the ARMA offerings is not legal review or debate.

For an article to fall within the Standards area it had to be solely about a standard, not about how a standard could be used to audit a RIM operation/program. The narrow line presents itself again, but if an article spoke about how a standard was developed or used to the benefit of an organization, it was teaching, and thus fell under Professional Development. Since ARMA has long stated its belief that standards are important and valuable tools for the RIM industry, and because ARMA has had a "standards development" committee in one form or another and has actively pursued, commented on and developed standards and technical papers dealing with RIM-related subjects, the author felt that investigating this particular area of the published content might be interesting. For many years the author was an active participant on that ARMA standards development committee, and has been published by ARMA on the subject (i.e., Bolton, 2011). So, at the outset the feeling was the data should show a clear level of representation for this area. However, the data showed no such interest: article content about standards represented only 1 percent of the total.

Thus, as mentioned above, the data showed that a whopping 84 percent of the article content was focused on Professional Development. Given that the nature of a membership organization such as ARMA includes developing and educating its members, this finding might not be unusual. It should be mentioned that bias by the author, in reading article content and determining an area for that content, could have influenced the overall numbers somewhat. This factor could be a topic of future research, as could another, more refined research approach. Nevertheless, for the purpose of this work, the finding is what it is.
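The five-area tally above reduces to a simple percentage computation, sketched below. The absolute counts are hypothetical (back-solved to be consistent with the shares reported in the text); only the five area names come from the study.

```python
# Convert raw per-area article counts into rounded percentage shares.
def percentages(counts):
    total = sum(counts.values())
    return {area: round(100 * n / total) for area, n in counts.items()}

# Hypothetical counts out of 200 articles, consistent with the
# reported 4/5/6/1/84 percent split.
areas = {"IT": 8, "RIM Program": 10, "Legal": 12, "Standards": 2,
         "Professional Development": 168}
print(percentages(areas))
# {'IT': 4, 'RIM Program': 5, 'Legal': 6, 'Standards': 1,
#  'Professional Development': 84}
```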

Advertising Content: To look at the published advertising content, two different views of the data were gathered. The first was data characterized by subject: Equipment (i.e., file folders, shelving, shredders, and storage boxes); Micrographics (i.e., film, cameras, and readers); IT (i.e., RIM software and hardware); Services (i.e., storage, auditing, shredding, and disaster recovery); Events/Awards (i.e., conferences, postings and acknowledgements); Books/Training; Standards; Job Ads; and Other (see Fig. 10). At the beginning of the research, a subject area for Legal was also considered; however, no data was found to support it. The second view concerned who was behind, sponsoring and paying for the advertising content. To understand this view better, seven possible groups were developed: ARMA, Vendors, AIIM (Association of Information and Image Management), Consultants, Schools (universities/colleges), ICRM, and Other (see Fig. 11). It was necessary to consider ARMA as a sponsor for some of the content because it advertised its own conferences, printed materials for purchase, and training, and, at least during the RMQ years, ARMA often published announcements concerning Board of Directors membership, awards, and other member recognition news.

Analysis of the data (see Fig. 11) found that 71 percent of the advertising content was sponsored by Vendors. This was certainly not a shocking discovery. The author had no firm expectations here, but generally believed at the outset that the data would show something close to this figure, if not higher. Consultants sponsored 4 percent of the advertising, which seemed a bit low; but considering that most consultants probably operated at the local level, paying to advertise in an international publication was probably not practical. One interesting finding was that ARMA paid for 21 percent of the content, and the data showed ARMA's sponsorship of advertising trending upwards in the last years of the investigation. Given that the ARMA publication was shrinking in its average number of pages per issue, this increase in advertising by ARMA could be evidence of the general state of the North American economy (i.e., fewer Vendors purchasing advertising), or sponsors may have been finding other ways to distribute their advertising (i.e., the Internet), leaving ARMA to fill its own publication pages. Regardless, these two findings appear to point to a convergence, and do not bode well for the future. This is one particular observation that should be watched.

A somewhat disturbing finding is the low level of advertising sponsorship by the ICRM: a meager 1.5 percent. Since a purpose of this work was to examine for evidence of professionalization, it was thought at the outset that, as Webster suggested, availability of and interest in advanced RIM education was going to trend upwards. Acquiring the CRM designation from the ICRM was one of the key avenues for gaining this advanced education. So, if RIM had been moving towards professionalization, certainly more and more RIM workers would have obtained their CRM, and the data should have shown strong sponsorship from the ICRM: its membership would have been growing, interest in the CRM designation would have been growing, and more money would have been available to advertise the benefits of obtaining the professional designation. However, the data shows many years with no ICRM advertising at all and, overall, a remarkably weak level of sponsorship. This finding probably would not have made Ira Penn very happy. (NOTE: RIM is currently experiencing a push in the area of "information governance." That push includes training and certification somewhat similar to the CRM. A future examination might be interesting to see how this push fares. This author will venture the opinion that, in the long run, it will fare no better than the CRM. Only time, and someone else's efforts, will prove the author wrong.)

Data concerning the breakdown of advertising content by subject area showed various levels of activity, but nothing was seen as remarkable (see Fig. 10). The largest subject area was IT at 36 percent, which was hardly surprising given the huge explosion of RIM-related software available in the marketplace, not to mention the impact of managing “e-records.” Service-oriented advertising showed 17 percent, and RIM Equipment showed 14 percent. One particular observation concerns advertising for Books and Training, which overall accounted for 15 percent. Through most of the investigation years, such advertising was for books, with some limited advertising for educational seminars/sessions. However, a significant jump in the volume of advertising appears in the later years. That advertising, mostly sponsored by ARMA, concerned training in the topic of governance. While no clear conclusion can be drawn from this observation, it was interesting, and it is one that could be watched closely to see what happens in the future. In other words, will training in governance replace the CRM? Will such training actually make the difference in moving RIM towards professionalization, or will interest in governance fade over time, to be replaced by something else?

Citations and Information Impact: According to Hoang, Kaur and Menczer (2010), “data from citation analysis can be used to determine the popularity and impact of specific articles, as well as gauge the importance of an author’s work.” When an author quotes the words of another, or uses that person’s thoughts to explain and/or argue their own point or position, that author must cite the original source or else commit plagiarism. Depending on the situation and its rules, at a minimum the citation of another’s work should include the person’s name, perhaps the year in which the original comment was made, and typically where that comment was found, i.e., where it was published. In making such a reference, an author gives some measure of weight to that other person. This does not necessarily mean that the one author agrees with the other, only that they recognize that the other person did some work in a particular area of study/research, and that, good, bad or indifferent, that work is worthy of recognition for its existence. The more times a particular person’s words are cited, the greater that person’s recognition, if not to the world in general, then at least within some community of interest such as RIM. When a particular publication, a journal or newspaper, is cited often, it too gains in recognition. To measure and rank publications by their citations, an “information impact factor” can be calculated for them. Thus, for any year, the higher the number of citations that refer to a particular publication, e.g., the Journal of Medicine, the higher the information impact of that publication is felt to be. In other words, a publication with a high information impact is viewed as having greater significance, and its content is viewed as being read more and having greater weight and influence.
It should be noted that there is an organization, Thomson Reuters (Social Science Citation Index), that gathers citation data from publications, counts that data to derive information impact factors, and publishes its findings. Not all journals, magazines and published materials are examined by Thomson Reuters. While the ARMA publication offerings are not among those publications that are measured, it is curious that, from Nigeria, the “African Journal of Library Archives and Information Science” is indexed by Thomson Reuters.

To gain an understanding of the information impact of the ARMA publication offering, i.e., the three publications under review, all the citations identified in the published articles were counted. This counting met a problem when, in 2009, ARMA made an editorial decision to stop printing the citations in the publication issues and to offer them only in an electronic version of the articles. Each electronic article had to be called up individually, which added greatly to the time necessary to gather the data. In examining a few of these electronic articles, the number of citations did not appear to rise or fall in any way different from citations printed on paper. So, given the extra time and effort involved to gather the electronic data, and because no apparent difference was observable, the author stopped gathering citation data at the end of the 2008 volume year.

In gathering the citation information, the data was separated into different source-type buckets (see Fig. 12): Journals, Books, Conference Proceedings, Vendor materials (white papers, etc.), Government publications, Standards, and offerings found on Internet websites/pages and blogs (see also: Nelson Edewor, 2013). From this data it was not surprising to find that 52 percent of the citations came from Journals, while 23 percent came from Books and another 10 percent from Government materials. This spread of citation sources seemed reasonable. The author also did not find it unreasonable that 4 percent came from Conferences, 3 percent from Vendors, 3 percent from Standards, and 5 percent from the Internet. The use of the Internet as a source does, however, raise other issues, such as the demise of printed materials, information validity, and a growing reliance on software and hardware to display, locate and retrieve information.

Now we come to the final finding of this work. The data revealed that the ARMA publication offerings had an overall information impact of 9.40 percent, calculated in the following manner:

Total number of citations = 2075

Total number of citations identifying an ARMA publication as their source = 195

Citation Information Impact Factor (195 / 2075 x 100) = 9.40 %
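The arithmetic above can be sketched in a few lines of Python. This is an illustrative helper written for this paper's calculation only, not part of any published tool; the two counts are those reported in the study.

```python
def citation_impact_factor(source_citations: int, total_citations: int) -> float:
    """Percentage of all counted citations that name the publication under review."""
    if total_citations <= 0:
        raise ValueError("total_citations must be positive")
    return source_citations / total_citations * 100

total_citations = 2075  # all citations counted across the three ARMA publications
arma_citations = 195    # citations naming an ARMA publication as their source

impact = citation_impact_factor(arma_citations, total_citations)
print(f"Citation Information Impact Factor = {impact:.2f} %")  # 9.40 %
```

Note that 195 / 2075 × 100 ≈ 9.3976, which rounds to 9.40 percent at two decimal places.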

While this information impact factor might not seem low, it is, and it gives some evidence of how readers really feel about the content offered in their membership publication. Certainly, since the ARMA publication is provided to every member of the organization, and on the surface one could view it as something of worth within that community, would it be unreasonable to think that the people in that organization would quote from it, and often? Remember, Pemberton told us that the focus of the ARMA publications was professional and educational. So, if the ARMA Editors have viewed its articles as worthy of the audience they are working for, is that audience not to view those articles as worthy? Clearly, since in twenty-one years of published articles only 195 out of 2075 citations came from an ARMA source (i.e., the three publications), that audience does not appear to view the information materials as being very worthy; otherwise, would they not have quoted them more often? This observation may in fact support the finding of the AIEF survey, which reported that those surveyed did not believe that the articles published in a reviewed journal would be of value to them personally. Was this feeling due to the fact that they already viewed their ARMA publication as not having much value to them? Did the members receiving the publications view the contents as mediocre and/or “dumbed down” to the point of being useless? These are somewhat harsh and yet very interesting questions, and this author hopes they are looked at and addressed through future efforts.

 

Conclusion:

Within the RIM community, at least in the United States and Canada, a belief appeared to exist, at least to some extent, that a state of professionalism existed for the field of RIM. That state had not been verified in any manner, or formally declared by any group or person; rather, its existence manifested itself through common use of the word “professional.” Possibly it persisted because it went unchallenged. Possibly it was due to a general desire by the RIM community to hold the belief. The research outlined in this paper took the position that, if such a state of professionalism existed, some verifiable proof should be available to openly discover and state.

In this work some basic findings were revealed. These included:

  • Authorship was shown to be 88% American, with Canadian authorship at 6%, barely beating out the UK and Europe at 4%.
  • Authorship was 63% male. (Given an ARMA membership that is probably in reality closer to a 75/25 split favoring the female side, this fact clearly identifies how weak the female voice seems to be.)
  • Consultants made up 31% of the authors.
  • At 24%, authors who are/were ARMA Editors (with the addition of a few lawyers) almost outstripped actual RIM workers at 26% of the authors.
  • The average size of the ARMA publication shrank over time from:
    • Records Management Quarterly = 73.4 pages per issue
    • Information Management Journal = 70.4 pages per issue
    • Information Management Magazine = 48.5 pages per issue
  • The credentials information offered by authors revealed no growth in CRMs or PhDs.
  • Data showed no trend in the level of advanced analytical research activity. And,
  • A “freelance writer” was among the top ranked authors.

So, in response to the outstanding questions mentioned above, did RIM achieve the rank of a true profession? Was our unspoken belief in reality a fact? Let’s review the evidence. When it comes to the development, understanding and use of “systematic theory,” RIM has the life-cycle concept, but little if anything else exists beyond it. One theory does not lend much weight to the goal of gaining true professionalization.

When it comes to the idea of “community sanction,” ask yourself whether your CEO could describe the difference between the work of a RIM person and that of an IT person, Librarian or Archivist. Are there community boundaries on the field of RIM that are definable and which allow for a demarcation between US and THEM? Except in the unusual case, it is probably unlikely that such a CEO could adequately make that differentiation or describe any line of demarcation between the roles. Is this so important? Possibly not, except that RIM workers often carry a heavy load of responsibility in their respective organizations and would probably like to be recognized, and adequately compensated, for that load, even potentially to the extent of being covered by professional insurance if the need arose. Therefore, since RIM work and workers are not clearly viewed as different from others, as distinct, the answer is no! RIM does not command any level of community sanction sufficient to be considered a true profession.

Within the CRM community, a Code of Ethics exists to be adhered to by its certified members. However, no such code of ethics exists for the wider RIM community through ARMA or any other RIM organization. This lack, plus the absence of any mechanism, policies and procedures to monitor actions and penalize code violations, means there is no backbone to the RIM body. A lawyer who is disbarred loses not only a career, but a livelihood, good name and reputation. What happens to a RIM consultant (or any RIM person) who takes the wrong path? Nothing, or very little of consequence. Oh, if caught they might lose their job, or get a reprimand, but in the sense of real, hard consequences, little ability for impact exists. Thus, RIM does not really meet this requirement for being a true profession.

What about a “culture?” Does RIM have a recognizable culture? Again, the answer is no, or at best it is weak. A language that RIM workers recognize and use daily and in their literature does exist; however, that language isn’t truly RIM’s alone. In fact, it is shared with several other groups, particularly IT, as RIM wanders farther down the “e-record” road. As for RIM having a different and definable approach and an attitude all its own, the answer is again no; or, to be fair, if they do exist, let someone identify and prove that existence through a separate effort of research.

Ultimately, has this work found that RIM gained professionalization, or, as Pemberton suggested, has it found us for far too many years simply regurgitating and churning on hearsay, conjecture, anecdote and propaganda? The content articles examined appear to lean on practical expertise and observation and not on verifiable proof, which doesn’t make them wrong; it only tends to reveal them, in their repeated nature, as weak. Beyond the life-cycle theory, little new theoretical work appears to have been accomplished, or at least published in the ARMA offering; an offering that has experienced a shift in its authorship from those who actually do RIM work to those who simply talk about RIM work. And, with the low level of citation given to the work presented in that ARMA offering, the impact of that information on its readers appears remarkably weak. That finding seems to coincide with the AIEF survey, which found responders felt a reviewed journal would offer them little of value. Was that feeling in fact due to people being simply tired of the rhetoric? Who knows? Therefore, no evidence was found via this work to verify any initiative, or even effort, to really move RIM towards the goal of becoming a true profession. One can only assume that, regardless of those in our past who stated we had a chance to reach the goal of becoming a true profession, they were either wrong, or somewhere along the line interest in the whole idea was lost or abandoned. Maybe, rather than lost, it was realized to be a myth, a dream, and therefore not given any credence. It may have been easier to believe in the myth than to fight for the reality.

As for the ARMA organization and its role, nothing really can be said. The data supports no evidence, trend or conclusion either way. Economics has clearly played a part, as is obvious in the changes to publication issue size, style and publication mechanism, i.e., printed to electronic. There were also editorial shifts. When Pemberton took the reins as Editor, there was an attempt to shift the publication towards a journal, with the aim of moving the content to something more scholarly and analytical. However, within three years that dream was dismissed. Did ARMA feel that its readers and membership weren’t worthy of a journal? Did it realize that the contents of a journal would be too high-level and possibly beyond the reading and comprehension of its average reader? Again, who knows? There was no discernible change that could be identified as the point where ARMA made an effort to grasp the brass ring of professionalization, or killed any existing efforts to truly reach it.

Ira Penn (1993) wrote that RIM was lacking in leadership and needed some major philosophical changes. He recognized the low number of CRMs in the field as pathetic, and he went on to say that the fault shouldn’t be blamed on others as “the downtrodden must take some responsibility for their own plight.” So, in the end, what is left to say? Well, to borrow an old expression, “the future is what you make of it.” Therefore, if RIM wants to be a true profession, it has a lot of work to accomplish to get there. Or, at a minimum, work could at least be done to help us RIM folks better understand why we can’t reach the goal, and how we can better prepare and conduct ourselves in the world we have to survive in. Such work could be undertaken at the PhD thesis level or purely as research conducted by RIM persons with an interest in discovery. Clearly, more could be heard from both the female and Canadian voices. Through the effort and publication of such work, RIM’s literature base increases, our recognition increases, our community is better defined, and our RIM world is better positioned for us today and for the future.

References:

  1. Pemberton, J. Michael and Pendergraft, Lee O. “Toward a Code of Ethics: Social Relevance and the Professionalism of Records Managers.” Records Management Quarterly 24(2), (1990): 3-11, 15.
  2. Pemberton, J. Michael. “A Profession Without Professional Literature.” Records Management Quarterly 27(3), (1993): 52-55.
  3. Force, Donald C. and Shaffer, Elizabeth. “Records Management and Peer-Reviewed Journals: An Assessment.” ARMA International Education Fund. 2013.
  4. Pennix, Gail B and Fischer, Marti. “Cumulative Index to the RMQ 1967-1987.” Records Management Quarterly 22 (1), (1988): 27-122.
  5. Greenwood, Ernest. (1966) “The Elements of Professionalization” in Vollmer, H.M. and Mills, D.L. (Eds), Professionalization, Prentice-Hall, Englewood Cliffs, N.J.
  6. Millerson, G. (1964) “The Qualifying Associations: A Study in Professionalization.” London: Routledge and Kegan Paul.
  7. Webster, Berenika M. “Records Management: From Profession to Scholarly Discipline.” The Information Management Journal 33(4), (1999): 20-30.
  8. Pemberton, J. Michael. “Records Management: Planet in an Information Solar System.” (paper presented at the ARMA International 35th Annual Conference, San Francisco, November, 1990).
  9. Bolton, John. “A 25 Year Content Analysis of RIM Information Impact.” (paper presented at the ARMA Canada Conference, Calgary, May 25-27, 2015).
  10. Edewor, Nelson. “An Analysis of a Nigerian Library and Information Science Journal: A Bibliometric Analysis.” (2013) Library Philosophy and Practice (e-journal). University of Nebraska, Lincoln. October, Paper 1004. http://digitalcommons.unl.edu/libphilprac/1004.
  11. Krippendorff, Klaus. “Content Analysis.” (1989) http://repository.upenn.edu/asc_papers/226
  12. Elo, Satu and Kyngas, Helvi. “The Qualitative Content Analysis Process.” Journal of Advanced Nursing 62(1), (2008): 107-115.
  13. Penn, Ira A. “Records Management: Still Hazy After All These Years.” Records Management Quarterly 27(1), (1993): 3-8, 20.
  14. Brumm, Eugenia K. “Graduate Education in Records Management: The University of Texas at Austin Model.” Journal of Education for Library and Information Science 33(4), (1992): 333-337.
  15. Walters, Tyler O. “Rediscovering the Theoretical Base of Records Management and Its Implications for Graduate Education: Searching for the New School of Information Studies.” Journal of Education for Library and Information Science 36(2), (1995): 139-154.
  16. Pemberton, J. Michael. “From the Editor: RMQ: The Next Generation.” Records Management Quarterly 32(4), (1998): 2.
  17. Bolton, John M. “RIM Fundamentals: Standards: Providing a Framework for RIM.” Information Management Magazine 45(3), (2011) 30-35, 51.
  18. Hoang, D; Kaur, J. and Menczer, F. “Crowdsourcing Scholarly Data.” (paper presented at the Websci. 10: Extending the Frontiers of Society On Line. Raleigh, NC. April 26-27, 2010).

 

FIGURE 1: Data Gathering Form (Blank)

FIGURE 2: Authors by Gender

FIGURE 3: Authors by Country

FIGURE 4: Authors with CRMs and PhDs

FIGURE 5: Authors by Their Type of Work

FIGURE 6: Authors by Employer Type

FIGURE 7: Authors by Rank

FIGURE 8: Authors by Rank – Adjusted Ranking

FIGURE 9: Article Content

FIGURE 10: Advertising Content

FIGURE 11: Advertising by Sponsor Type

FIGURE 12: Citation Data and Information Impact

Records and Information Management Issues in First Nations Governments

February 26, 2016

Abstract: This paper focuses on the challenges of establishing effective records and information management programs in First Nations government bodies, primarily within Indian Band or independent First Nations government bodies. While they share similarities with organizations of all types, First Nations government bodies also face some unique barriers because of the span of responsibilities they oversee, the diverse ways in which they are funded, and the chronic shortage of manpower or capacity within their organizations to perform their work. Records and information management professionals have an opportunity to train and provide leadership for these developing government bodies.

Introduction

At a recent symposium on First Nations Records and Information Management1, Grand Chief Ed John (Akile Ch’oh) underlined the importance of records management for First Nations governments. He described the key role that a staff review of archived documents played in supporting the host, the Musqueam First Nation, to establish their right to claim market rent for land that had been leased on their behalf at a disadvantageously low rate by the Government of Canada.2 This public statement of support was followed the next day by his salutary remarks at the meeting of the First Nations Summit, addressing the chiefs and officials, and congratulating the successful symposium for contributing value to the capacity of First Nations governments. Grand Chief John’s example and his recognition of the value of records emphasize the importance that the senior management levels in First Nations governments are placing on an effective records management program.

However, these government bodies also face unique challenges to implementing effective records and information management programs. At a Joint Gathering in October, 2015, in a session on information management, several senior managers in the audience confessed that their biggest problems in their offices were caused by lost information, and the daily frustrations faced by staff in the constant hunt for records.

 

Types of First Nations Governments

In order to review the records issues, it is important to understand the different types of First Nations governments that exist in Canada.

The term “First Nation” is now used commonly to replace “Indian Band”, and is the term that will be used in this paper to describe the various types of aboriginal government groups that are present in Canada. Currently, there are 617 First Nation communities, which represent more than 50 nations or cultural groups and 50 Aboriginal languages.3

Across Canada, as First Nations governments establish their independence from direct federal government management, or pursue pathways to strengthen their claims, these government bodies have a unique opportunity to build records and information management systems based on information governance principles, policies and best practices that the records and information management profession has established. In many cases, particularly in the western part of the country, the First Nations are establishing a new order of government, and can build systems from inception that incorporate the necessary elements. However, these organizations also face unique records and information management challenges as they move forward to establish their governance models.

Given the variety and complexity of First Nations government organizations today, one can easily confuse the subtleties and differences among these various groups. They are not all the same in their composition and governance.

An Indian Band is

“a body of Indians for whose collective use and benefit lands have been set apart or money is held by the Crown, or declared to be a band for the purposes of the Indian Act. Each band has its own governing band council, usually consisting of one chief and several councillors. Community members choose the chief and councillors by election, or sometimes through custom. The members of a band generally share common values, traditions and practices rooted in their ancestral heritage. Today, many bands prefer to be known as First Nations.”4

The First Nations in this category are closely tied to the federal government for their core operational funding, although they may also have own-source revenue.

Self-governing First Nations are

“governments designed, established and administered by Aboriginal peoples under the Canadian Constitution through a process of negotiation with Canada and, where applicable, the provincial government”5.

The First Nations in this category have negotiated a settlement for their own territory and rights. This includes obtaining payment for their land, and developing their own revenue sources. Depending on their settlement, they have some or complete independence from the federal government for their funding. In addition, they have their own governance models, election processes, membership admission and management, and pass their own laws, such as Freedom of Information and Protection of Privacy Laws.

Tribal councils are regional affiliations of First Nations, frequently united by language or adjacent land areas. Within these collectives, they may agree to manage resources or undertake activities jointly rather than as separate entities. As an example, the Naut’sa mawt Tribal Council (NmTC) is

“a non-profit society that provides advisory services to its eleven member First Nations in five core delivery areas: economic development, financial management, community planning, technical services, and governance.”6

Understanding the distinctions between these groups helps explain how they are funded and, consequently, why they face the records management issues and challenges identified in this paper.

 

How First Nations Governments are Funded

In 1867, under the terms of the British North America Act, Indians and lands reserved for Indians became the responsibility of the Federal Government, a responsibility exercised under the Indian Act. Most programs and services that non-native citizens receive from a variety of other governments and service providers are provided to First Nations through Federal Government programs. The support to operate First Nations government services and programs is a combination of annual contribution agreements and the communities’ own funds, or “own source revenue”.7 Contribution agreements do not cover programs such as records and information management.

The adequacy of the funding has been the subject of much debate and is not the point of this paper. However, the consequence of this blend of funding is that, in addition to the annual contribution, most First Nations are also seeking funding from other sources, including making arrangements for specific funds or memoranda of understanding with external organizations in order to provide government services to their communities. This search for funding takes staff time away from daily operations and creates uncertainty about the continuity of programs.

A search for “program funding guidelines 2015-2016” on the Indigenous and Northern Affairs Canada (INAC) website8 returns 33 results, covering a wide range of programs including “community opportunity readiness program”, “basic organizational capacity contribution program fund”, “elementary/secondary school fund”, “skills/jobs training program” and others. This search did not cover other federal agencies, such as Health Canada, to review what funding programs might be offered for First Nations.

Among the possible funding sources from INAC, the Professional and Institutional Development Program (P&ID) is currently the program through which First Nations communities are applying for funding to support records and information management programs. The program objective is

“To develop the capacity of First Nations and Inuit communities to perform core functions of government, by funding governance-related projects at the community and institutional levels.”9

There are ten core functions of government listed as eligible for support in the P&ID program, including information management/technology. Consequently, First Nations communities can apply for funding for support of projects, activities and expenditures, such as professional/consulting services, training, and professional development or travel to courses. Currently, this fund is being accessed for records management program development, information management policy development and a variety of consulting services for records and information activities in communities. If successful, recipient agencies must submit a report on activities and expenditures. Language in the funding agreement permits INAC to share the results of these activities with other First Nations.

However, organizations are cautioned that the P&ID program is “project funding only” and does not provide core operational funding to the community. How does the community fund records and information management on an ongoing basis? Hopefully it is provided from the contribution agreement for core funding and sustained long term. More typically, the records and information management programs become projects, operating in bursts of activity only as long as the project funding is available.

 

Government program responsibilities

When comparing the functions of a First Nation government to another type of government body, a functional analysis reveals that the First Nation has a very wide span of responsibilities, as contrasted with a city or town, a province, or even a federal government ministry. Typically the First Nation government manages the following programs and services:

  • land administration and regulation, including specific claims for land, referrals for other organizations for use of land, land use planning, and comprehensive community planning;
  • housing, including financing, construction and development, maintenance, tenants and rent;
  • community and social development, including child development, welfare, social services, elder care and support;
  • education and lifelong learning, including day care, early childhood development and support, school development and funding or relationships with local school authorities, and financial support for students in post-secondary education;
  • engineering, public works and infrastructure, including streets, roads, water supply and maintenance, sewerage and waste management, and environmental management;
  • fisheries and wildlife management, including licensing, catch management and sales, fisheries fleet management, aquaculture, hunting permits, mushroom licensing, and range management;
  • forestry, including logging, silviculture and tree farm management, and wood sales;
  • health care services, including clinics, patient transportation and support, liaison with other health agencies, specialized health programs, preventive programs, and addiction programs;
  • protective services, including law enforcement, fire protection, and emergency services;
  • culture and history, including archaeology, language support, and specific cultural programs such as collecting oral histories;
  • economic development, including establishing corporations for such activities as aboriginal tourism, store operation, development of business partnerships for resource development, etc.10

There are also the common administrative functions typical of all organizations. Within these functional areas, however, such as legal matters, there may be specific land claims, treaty negotiations or litigation. They may also operate their own membership or Indian registry functions. In addition, there may be partnerships with tribal councils for joint activities on resource use, environmental matters or economic development.

In other types of governments, these functions are performed by separate agencies, such as a municipality, a health authority, a school district, a social services branch of a provincial government, a housing agency, and separate police, fire or emergency services organizations. Moreover, each of these agencies has the services of records and information managers, information technology staff, archivists and museum staff, and appropriate facilities to store and manage the records accrued.

The diverse government functions generate volumes of records. These records relate to current business, but may also include historic documents, maps, language records, oral history records, cultural property and materials that belong in a cultural or historic centre, archives or museum. The program records are often linked to externally imposed record keeping requirements. How can a First Nation sustain records and information management across these broad and often competing functions? In light of the competing interests among these portfolios, it is difficult to establish a program without specific funding, and what funding exists may be limited to projects rather than an ongoing program.

 

Records Management Issues and Challenges

“The usual suspects”

Like other types of organizations, First Nations governments experience some typical challenges in establishing records and information management programs. These include:

 

  • The need for leadership and senior management support, without which programs receive no financial or organizational support and buy-in. In most organizations, records managers must establish the value proposition in order for the program to obtain a mandate, funding and support;
  • The growth of organizational functions and the competition for program funding. As organizations grow, funding must often be directed to the programs most directly associated with key business operations. This competition, especially when there is uncertainty about the value of records management, can lead to the program being seen as not core to the organization, or merely “administrative”, and consequently underfunded, or funded with whatever remains, often little or nothing;
  • The diverse mix of materials to manage. Personnel use a blend of technologies to create information, and place equal importance on historic and cultural records and on current, frequently digital, formats. Consequently, records managers are required to assert control over and manage a diverse array of information.

At the Records and Information Management Symposium cited earlier, participants were asked to identify their records management problems. Of the choices provided, the greatest number of respondents (30 of 65) answered that “all of the above” were their problems:

  • no funding,
  • no political will,
  • no buy in from management and staff,
  • not sure where records are stored, and
  • no designated Records Manager to assign

These types of problems can be present in any type of organization, and are familiar to records managers in many settings.

Issues unique to First Nations bodies

In addition to the problems cited, the First Nations governments also experience some unique challenges with records and information management programs.

 
Record Keeping Terms within Agreements

Frequently, agreements for services contain terms and conditions with which a First Nation must comply to maintain the relationship.

Most often, within many of the programs that First Nations provide, there is a provision for submitting reports and maintaining records for audit purposes. Beyond reporting, there are often much more specific requirements.

The provision of health services is an example with defined record keeping requirements. As previously described, the health portfolio is a key area present in most First Nations. In 2013, the First Nations Health Authority in British Columbia assumed the responsibilities and programs formerly provided to British Columbia First Nations through Health Canada. Interim and year-end reports are required for each of the programs for which the community receives contribution funding.11

Data management is also one of the key components of BC First Nations health governance.

“One of the action items in the Transformative Change Accord: Tripartite First Nations Health Plan is to improve the collection, use and sharing of First Nations health data in order to:

 

  • Increase First Nations involvement in decision-making concerning their data and services and develop the capacity of First Nations in the area of health information
  • Facilitate access to accurate, timely, reliable health information for First Nations to inform decision-making and use health data to improve the quality and effectiveness of health programming
  • Facilitate and support principles of First Nations health information governance. “First Nations Health Information Governance” is a component of First Nations Health Governance and refers to a structure, process and protocols by which First Nations in BC have access to First Nations data and are influentially involved in decision-making regarding the culturally appropriate and respectful collection, use, disclosure and stewardship of that information in recognition of the principle that such information is integral to First Nations policy, funding and health outcomes.”12

Regional meetings throughout the province from February onward will provide further training on such data governance subjects as research and data anonymity.

The Indian Register is another function that is managed in many First Nations government offices, although it may also be managed in a regional office of Indigenous and Northern Affairs Canada. The Register

“is the official record identifying all Registered Indians in Canada. Registered Indians are people who are registered with the federal government as Indians, according to the terms of the Indian Act. Registered Indians are also known as Status Indians. Status Indians have certain rights and benefits that are not available to Non-Status Indians or Métis people. These may include on-reserve housing benefits, education and exemption from federal, provincial and territorial taxes in specific situations.

The Indian Register contains the names of all Status Indians. It also has information such as dates of birth, death, marriage and divorce, as well as records of persons transferring from one band (or First Nation community) to another.”13

A staff member who is trained to manage the Indian Register must follow the protocols included in the agreement under which the individual community manages the register. These protocols will include establishing processes to maintain the privacy and security of the required records and information. The types of information listed above are among the most sensitive personal information, and its custody and use require an understanding of its permitted uses and disclosure.

For self-governing First Nations, the equivalent function is expanded to include their Membership Registry, which is the list of members of their nation, and their processes for admitting members into their community. The essential nature of this information requires that the staff members secure and protect the records against all possible disasters. This function is the equivalent of a Vital Statistics unit at the provincial level.

A third example of service provision by agreement in many communities is the Child and Family Services portfolio, by which the First Nation operates child welfare services, including child protection and child custody. The service is administered through, and provided by, the relevant provincial ministry, and the First Nations social workers and other staff will operate the program according to the standards established by the ministry. Particular requirements include the templates for documentation required, and the reports and audit information to be submitted. In addition, there are compliance requirements for privacy provisions necessary to protect the child and family identity. The information is maintained according to the provisions of the provincial protection of privacy legislation.

Personnel who administer the terms of these individual agreements are imbued with a sense of responsibility for the terms of agreement. This sense of individual responsibility often leads to siloes of activity, with staff in the portfolios unable to share their information with their peers in other departments within the First Nations government office.

Limited capacity/manpower

The term “capacity” is used to describe the capability of the organization, usually in terms of staff numbers and skill sets. Despite the wide range of functions for which they are responsible, the typical First Nation office does not have many employees. Of the 617 communities, 60 percent have fewer than 500 members.

As an example, a very small First Nation, with 86 members, has 10 staff in its office. With this small number, staff members assume responsibility for several portfolios in order to provide the range of services that the membership is entitled to receive. In addition, this First Nation employs several staff on contract to perform core functions of information technology management, housing management and policy development. The person performing the policy development role supervises a contract for records management services. Information technology support is provided by an external service provider, who is responsible for the network, applications, hardware and software, as well as help desk support.

In a great number of communities, working for the First Nation government is the key source of employment, and governments see it as their responsibility to ensure as many members as possible have work. Whether employees have the necessary skills can be an issue. Hence, in most communities, there is training on the job, or attendance at courses, as a way to develop employees’ skills to perform their jobs.

In British Columbia, as communities established their own treaties, the issue of building capacity for their government bodies was addressed by the chiefs. The First Nations Public Service Secretariat (FNPSS) was established in 2008, specifically to address the development of skilled staff in the areas of financial administration, human resources development, policy development and information management. As part of the FNPSS information management capacity initiatives, consultants and partners developed the Information Management Toolkit, and provided training through nineteen 2-day Information Management Boot Camps hosted by individual First Nations and tribal councils.

Until its funding was removed in 2014 by the former federal government administration, the FNPSS built partnerships with agencies and organizations that provided training for their membership. One of the key partnerships created by the FNPSS was with ARMA Canada and the ARMA Vancouver Chapter.

The ARMA region and chapter were signatories to a Memorandum of Understanding14 with the FNPSS and the First Nations Summit in 2010 to provide training and capacity building opportunities for First Nations and First Nations organizations in British Columbia. The specific areas of work included provision of training to First Nations record keeping staff, professional development opportunities, research and development and mentorship for individuals. Since the signing, ARMA Canada has offered First Nations Information Management Boot camps as preconference sessions at the annual conference, has introduced topics of specific interest or held a complete track of topics relevant to First Nations organizations. The ARMA Vancouver Chapter was a co-host to the 2016 FN Records and Information Management symposium, and is providing financial support to an upcoming one-day symposium in Whitehorse. Attendance at the Vancouver symposium was expected to be 75 delegates, but the final number of registrants was 98, with a total attendance of 134, when speakers and sponsors were included. 

Who is the records and information manager?

With consideration of the factors described, from the extensive number of portfolios to manage, the uncertain and varied sources of funding, and the limited number of staff, who is performing records and information management duties in First Nations government bodies?

A survey of First Nations governments in British Columbia and the Yukon, conducted by the First Nations Summit and ARMA Vancouver Chapter in November 2015,15 determined that only 16% of the agencies surveyed had one person managing records, as contrasted with 80% where each department or individual managed their own records (4% did not know). Of 101 responses, only 8 indicated that there is an individual with “records” (manager or clerk) in their job title, and only 3 indicated that there is a records and information, or records and archives, department.

Who else manages the records? The survey further revealed that if there was one person assigned, the most typical person is the administrative assistant to chief and council, followed by the membership clerk or receptionist.

The survey confirms that most of the time, each department or each individual is left to manage the records on their own. The staff member may be organized and maintain a system, particularly if they are following rules established through a service agreement. However, usually, the records are maintained according to the individual’s own method and most are not inclined to manage records.

Most current records are now born digital, and most communities are able to use electronic communication. In the same survey, respondents indicated that there were electronic document/records management systems in place in 16 organizations.

Despite the use of computers and digital formats, staff members have difficulty locating the past records. Even in small organizations, when a person leaves their position, the successor staff member cannot find the records or make sense of them. Often the past files will be in various locations, with no uniform descriptions or finding aids to assist current staff to locate and use them.

So, we are left with the situation described by the leaders – the Grand Chief seeing the vision and practical value of records, and the leadership in communities hampered by not finding information.

The Future of Records and Information Management for First Nations

Information governance principles, such as ARMA International’s Generally Accepted Recordkeeping Principles®, indicate that for programs to be fully developed, accountability, transparency, integrity, protection, compliance, availability, retention and disposition must all be in place. Few First Nations government programs currently in place meet the requirements of the Principles®.

Grand Chief John’s comments indicate that records and information are valued assets in First Nations governments. He was discussing a specific legal case, which foreshadowed the current situation with litigation, e-discovery and the need for governments to have reliable and authentic records to support the variety of legal cases going forward.

The support from leadership will help lead the way forward to fully developed programs in all governments.  However, from the words of the leadership to the operation of a fully developed program, there is a long pathway to be built. The support must be followed by specific funding and staffing to enable the program to develop, along with the other nine elements of government capacity16. Given the wide focus of their mandate, and the diverse body of information that a typical First Nation government is managing, a department consisting of trained records, information and archival professionals is required.

As new government bodies are established through treaty, the First Nations have a unique opportunity to build systems based on the principles, practices and procedures that have been developed by the Records and Information Management profession. A comparison with other types of government bodies in Canada should be made as funding is negotiated, to ensure that there is adequate support for their information assets.

Canadian records and information management professionals have been reaching out to the local First Nations communities to provide services and support. That support, as represented by the working relationship in British Columbia through the Memorandum of Understanding, has started a process to provide education and training initiatives. That support has also helped to raise the awareness of what is possible. We also have a unique opportunity to be advocates and leaders, assisting the First Nations staff to make the case and develop the strategic opportunities for records and information management programs.

 

Works Cited

1. 2016 First Nations Records and Information Management Symposium, February 2, 2016. Musqueam Community Recreation Centre, Vancouver, B.C. Jointly sponsored by the First Nations Summit Society, Musqueam First Nation, ARMA Vancouver Chapter, Naut’sa mawt Tribal Council, and Collabware.
2. The referenced case, R. vs. Guerin, 1984, was the decision by the Supreme Court of Canada that established the Government of Canada’s fiduciary duty, a trust-like relationship, to manage the financial affairs of the band. The case involved an agreement to lease band lands for a golf course on terms worth 10 percent of the amount originally agreed to by the Musqueam; details of the changed terms were revealed to the Musqueam 12 years after the lease was signed. This case is also described as setting a precedent for aboriginal rights in Canada.
3. Indigenous and Northern Affairs Canada. https://www.aadnc-aandc.gc.ca/eng/aboriginalpeoples/firstnations/
4. Terminology. Indigenous and Northern Affairs Canada. Aboriginal Peoples and Communities. http://www.aadnc-aandc.gc/ca/eng.
5. Ibid.
6. Naut’sa Mawt Tribal Council. http://www.nautsamawt.org/. There are 11 member First Nations on both sides of the Georgia Strait between the British Columbia mainland and Vancouver Island.
7. Schwartz, Daniel. “How does native funding work?” CBC News, February 6, 2013. http://www.cbc.ca/news/canada/how-does-native-funding-work/
8. https://www.aadnc-aandc.gc.ca/eng. Accessed February 12, 2016.
9. Ibid.
10. Based on the functions in the First Nations Information Management Toolkit, published by the First Nations Public Service Secretariat, Vancouver, BC, 2011, and available for download at www.fns.bc.ca/fnps.
11. Funding Arrangements. http://www.fnha.ca/what-we-do/funding-arrangements. Accessed February 12, 2016.
12. Data Management. http://www.fnha.ca/what-we-do/data-management. Accessed February 12, 2016.
13. Indigenous and Northern Affairs Canada. https://www.aadnc-aandc.gc.ca/eng/indianstatus/theindianregister
14. Memorandum of Understanding between the ARMA Canada Region, the ARMA Vancouver Chapter, the First Nations Summit Society, and the British Columbia First Nations Public Service Secretariat (BCFNPSS), effective October 15, 2010.
15. Survey conducted by the First Nations Summit and ARMA Vancouver Chapter, November 2015, to obtain guidance on topics to present at the First Nations Records and Information Management Symposium, February 2, 2016.
16. The Professional and Institutional Development program lists leadership, membership, law-making, community involvement, external relations, planning and risk management, financial management, human resources management, information management/information technology and basic administration as the ten core elements of governance.
2016 First Nations Records and Information Management Symposium. February 2, 2016. First Nations Summit Society, Musqueam First Nation, ARMA Vancouver Chapter, Naut’sa mawt Tribal Council and Collabware, sponsors.
First Nations Public Service. Information Management Toolkit for First Nations. 2011.
http://www.fns.bc.ca/fnps
Indigenous and Northern Affairs Canada. Aboriginal Peoples and Communities. http://www.aadnc-aandc.gc/ca/eng.
Saloman, Tamisha and Erin Hanson. R. vs. Guerin, 1984. Vancouver, BC. http://indigenousfoundations.arts.ubc.ca/home/land-rights/guerin. Accessed January 31, 2016
Schwartz, Daniel “How does native funding work?” CBC News, February 06, 2013. http://www.cbc.ca/news/canada/how-does-native-funding-work/

Some Personal Reflections on the History of RIM in Canada

by Jim Coulson, CRM, FAI

Many Records and Information Management (RIM) professionals know of the pioneering work of Emmett Leahy1, an American icon in the development of the life cycle approach to managing records and information. However, little has been published about the major contributions of Canadians to the field and profession of Records and Information Management.

The Public Archives of Canada2 pioneered records management concepts in the federal government going back to the 1930s. However, the growth of modern records management in Canadian businesses began in the mid-1960s. The Acme Seeley Company of Waterloo, ON, invented and received a Canadian patent for associating colours and numbers – colour-coding of file folders with lateral filing was born! Meanwhile, new concepts for global organizations advanced as Canadian Harold Moulds was drafted to develop the first records management program for the United Nations in New York. The first Canadian chapter of ARMA was chartered in Montreal on December 16, 1968 and, a few days later, ARMA Toronto was chartered.

My personal story with RIM began in May 1970 in Canada. A summer job in a record centre in Toronto turned into a full-time position in the Records Management department at Canada Permanent Trust. My boss, Major George Mowat, had brought his years of army quartermaster experience to this bastion of Bay Street and set about developing and implementing records retention schedules in all locations of the company. About the same time, programs like this one, and another led by Dick Parmenter of Inco, were emerging in major corporations across Canada.

By this time, the value of RIM was being acknowledged in the US, but many of the advances were happening in Canada. Colour-coding and lateral filing were significantly refined by Don Barber and Tom Scrymgeour of Datafile, Toronto, expediting their acceptance throughout Canada and the US. During the 1970s and early 1980s, virtually every North American insurance company, hospital, mortgage company, insurance agent, doctor and dentist office adopted the productivity and space savings of the Canadian invention of colour-coding with lateral filing.

Then, as the fledgling personal computer industry began to impact offices globally, Canada became a hotbed of innovation in automated records management software. Much of the credit for this advance in Canada was due to vendors being able to develop software that met standardized needs of the federal government. The “Block Numeric System” was adopted as a uniform file classification system for federal employees and spread to most provinces. This spawned innovations like automated file tracking at Alberta Energy, Forestry, Lands and Wildlife under Records Manager Norma Dalton; records management software advances led by Bruce Miller of Ottawa (Provenance and Tarian, which was later sold to IBM); and leading document and records management software companies in Canada like Hummingbird, PSS and OpenText.

ARMA International was not standing still during this period, and some of its most significant advances were pursued by Canadians and occurred in Canada. It was in Toronto in 1975 that Don Barber and a group of Canadian and American records managers got together and negotiated the merger of what was then known as the American Records Management Association (ARMA) with the Association of Records Executives and Administrators (AREA) into a new entity called the Association of Records Managers and Administrators, the predecessor of today’s ARMA International. That same year, Don Barber played a major role in bringing about the creation of the Institute of Certified Records Managers (ICRM).

In 1980, a landmark conference on records management in Canada was held at the Banff Springs Hotel. Bob Morin of Saskatoon was a passionate RIM pioneer who brought records management professionals together from across the country to put on the first Canadian Records and Information Management conference. Two years later, the conference was held in Montreal, with presentations and proceedings in both official languages. The ARMA Canada conference has since become a highly respected annual event.

Another phenomenon in Canadian records management occurred in 1980 with the creation of a Special Committee on Records Retention (SCORR), chaired by Ted Hnatiuk, with Carl Weise and myself, all members of the ARMA Toronto chapter. Supported by ARMA chapters across the country, SCORR was successful in petitioning the federal government to produce a definitive list of federal regulatory requirements affecting records retention. The first “red book” of federal records retention regulatory requirements in Canada was published in November 1980.

The following year, the membership of the Toronto Chapter of ARMA doubled to almost 300 members as the legal community began to understand the inherent value of records retention and management. The SCORR initiative evolved into a national Canadian Legislative and Regulatory Affairs (CLARA) group in 1982. Their mandate was to broaden the conversation and pursue Privacy and Freedom of Information, Trans Border Dataflow and Admissibility of Evidence – all harbingers of what was to come. These Canadian successes were the forerunners of the popular Canadian, Washington and Global Policy Briefs published today by ARMA.

Whether it is the ARMA International presidencies of Christine Ardern, John Smith, Jim Spokes and Rick Weinholdt, the impact of Don Barber on colour-coding world-wide, the Australian mentorship of Pat Acton, the global reach of the InterPARES projects of Luciana Duranti, the national leadership of Judi Harvey, Sandi Bradley, Caroline Werle and Ted Ferrier, (without whom there would be no goosing ceremonies) or the National Archives outreach of John McDonald, Canadians have been at the forefront of innovation and leadership in records and information management. It is not surprising that there have been four Canadian Fellows of ARMA International3 and seven who have received the highest global award in the records and information management profession, the Emmett Leahy Award4.

Canadians should be very proud of the people, ideas and products that have been made, and continue to be made that have such a global impact on Records and Information Management.

 

Works Cited

1. Further information on Emmett Leahy can be found at http://www.emmettleahyaward.org/leahy-bio.html.
2. The Dominion Archives was founded in 1872, became the Public Archives of Canada in 1912, was renamed the National Archives of Canada in 1987, and in 2004 was merged into a combined entity named Library and Archives Canada (LAC).
3. No. 7 Judi Harvey, No. 15 James Coulson, No. 26 Christine Ardern, No. 47 Sandi Bradley.
4. Anneliese Arenburg 1989, Patricia Acton 1992, James Coulson 1997, John McDonald 1999, Christine Ardern 2002, Bruce Miller 2003, and Luciana Duranti 2006.

DISPELLING MYTHS ABOUT RECORDS RETENTION IN CANADA

By Stuart Rennie, JD, MLIS, BA (Hons.)

 

INTRODUCTION

I have spent over 20 years of my practice advising organizations about their information governance. As a lawyer and records management consultant, I have learned that there are many myths about records retention law within the records and information management (RIM) profession in Canada. These myths hamper a proper understanding of records retention and increase the risk that records managers apply the wrong law, or no law at all, in their work, exposing their organizations to legal liability. These myths persist despite a wealth of information in the RIM profession proving them patently false. I will consider 6 of these myths in this article, demonstrate how they are myths, and explain why Canadian records managers and information governance professionals should not follow them.

The 6 myths are:

  • Myth 1: All Records Have A Legal Retention Period;
  • Myth 2: Business Records In Canada Have A 7-Year Legal Retention Period;
  • Myth 3: Legal Retention Periods In The United States Apply In Canada;
  • Myth 4: All Employee Pay Records Must Be Retained Indefinitely;
  • Myth 5: Since Emails are Transitory, Emails are Not Records and Not Part of A Records Retention Schedule; and
  • Myth 6: There Are No Legal Consequences For Destroying Records, With Or Without A Records Retention Schedule.

 

MYTH 1: ALL RECORDS HAVE A LEGAL RETENTION PERIOD

Some Canadian records managers mistakenly believe that all records have a legal retention period. A legal retention period is the period of time specified by law for which an organization must keep records. These retention periods are usually expressed in the law as a minimum number of years for which records must be kept, and after proper legal research they are most commonly found in applicable statutes and regulations.

However, the majority of statute law and regulatory requirements do not set out a legal retention period. Canadian law is replete with legislative requirements for organizations to create and maintain classes of records, but provides no stated legal retention requirement.

But without a specified legal retention period, what is an organization to do?

We know from practice that the vast majority of records created are not permanently preserved. So, how long does an organization need to keep records to be legally compliant when there is no guidance in the law? Proper legal research is required to determine which records have, or do not have, legal retention requirements.

This legal research must first locate the applicable law. While most legal retention periods are found in statutes and regulations, retention periods can also be found in other sources of law. Canadian courts are one source. From the Supreme Court of Canada, the top court in the country, down through the provincial Courts of Appeal and Superior Courts, and even the Provincial Courts, the courts provide guidance for organizations on how to interpret the law that affects their records.

Boards and tribunals are another source. These include the federal, provincial and territorial Information and Privacy Commissioners and a wide range of administrative tribunals across Canada: human rights, labour relations, regulators of professions, property assessment, environment and workers compensation, to mention a few.

Contracts, collective agreements, bylaws and policies are yet another source of law.

Legal research must also show whether a given law even applies to the organization. Is the law in force? Has the law been changed? Repealed?

Organizations should conduct proper legal research annually, since laws often change. An organization that relies on outdated or repealed law is at risk of penalties for non-compliance.

Organizations need to conduct their own legal research or retain a qualified lawyer with experience in records management and records retention law. Providing a competent legal review of a records management and information governance system is a daunting task for an individual without legal training. Often even in-house legal counsel will defer to outside counsel who specialize in legal reviews of records retention schedules, because it is not their area of specialty and is very time consuming.

 

MYTH 2: BUSINESS RECORDS IN CANADA HAVE A 7-YEAR LEGAL RETENTION PERIOD

It is a common myth in Canada that all business records have a 7-year retention period. This myth is operative even outside the records management profession. For example, in a 1995 debate in the British Columbia Legislative Assembly on imposing a 7-year legal retention period on payroll records, a Member of the Legislative Assembly referred to retaining records for 7 years as “the infamous seven-year rule”.1

Infamous or not, there is no blanket 7-year retention rule in Canada. The exact origin of the “7 year rule” for business records is not known; the Canadian Income Tax Act appears to be the source. Section 230(4)(b) of the Income Tax Act requires specified tax records to be kept for a minimum of six years from the end of the last tax year to which those records relate.2 For individuals, the tax year is the calendar year; for corporations, the tax year is the fiscal period chosen by the corporation.

The Income Tax Act legal retention periods apply only to income tax records in Canada. An enormous number of records do not come under them. They do not apply to business records such as payroll records, contracts, memoranda, minutes, articles of incorporation, business licenses, client records, sales and marketing records and workers compensation records.

 

MYTH 3: LEGAL RETENTION PERIODS IN THE UNITED STATES APPLY IN CANADA

Due to the common use of free Internet search engines in our workplaces today, there is a myth that legal retention periods in the United States apply in Canada.

An example disproves the myth. Let’s imagine a records manager working for a non-profit organization in Ontario. The Ontario non-profit is also a registered charity with the Canada Revenue Agency (CRA) and actively seeks donations from the public as an integral part of its budget.

The Ontario non-profit’s records manager is not legally trained. Her manager asks her to create a records retention schedule. An ARMA International member, she attends her local Chapter meeting and asks about creating a records retention schedule. She is referred to ARMA International’s 2015 standard, Retention Management for Records and Information (ARMA International TR 27-2015).3 She purchases it and finds it very helpful, with loads of new information.

 

TR 27-2015 defines “legal value” as the “usefulness of a record in complying with statutes and regulations, as evidence in legal proceedings, as legal proof of business transactions, or to protect an individual’s or organization’s rights and interests.”4 She finds further guidance in TR 27-2015 on determining legal value by reviewing statutes and regulations, whether federal, state or local.

She, like many records managers, is time- and cash-strapped, so she turns to the Google search engine for the legal research needed to determine the legal value of her non-profit’s records and to recommend specific legal retention periods. She does a Google search using the term “Records Retention And Disposition Guidelines”. That search produces, on the first page of results, a hit for the Smithsonian Institution Archives’ “Records Retention And Disposition Guidelines”, 2008.5

The Smithsonian seems a credible organization to her. Established in 1846 on the National Mall in Washington, D.C., the Smithsonian has become the world’s largest museum and research complex, world famous and a popular tourist destination.

On page 1 of these Guidelines, the Smithsonian says that these Guidelines may be “freely used and modified by any non-profit organization”. “Free is always good”, she says. The Guidelines appear to apply to a Canadian non-profit organization.

The Guidelines refer to a long list of American (USA) federal and state laws containing recordkeeping requirements. That would appear to meet the TR 27-2015 requirements. The Guidelines seem reasonably current to her, being only about 8 years old.

The section “How Long to Keep Records” contains a long list of minimum records retention periods, running from accident reports to workers compensation.

For instance, donations are retained for 7 years.6 So far, so good. Compliance for the legal value of her organization’s records seems to be met.

She decides to use these legal retention periods for her organization’s retention schedules. Mischief managed, end of story, “what’s next on my To-Do list?” she wonders.

But four legal liabilities lurk in the background. First, the Guidelines are over 8 years old. Laws change all the time! Some laws change weekly, with different aspects being brought into force and others repealed. Annual reviews of your records management program are therefore vital.

The Canadian Parliament and the provincial Legislative Assemblies usually meet at least once a year to debate, enact, amend and repeal laws. The volume of law can be very large. For example, in British Columbia in 2015 alone, the BC Legislature enacted 42 bills, from administrative law to workers compensation.7 Along the way, the BC government in 2015 also created, amended or repealed over 400 regulations; it put into legal force over 40 statutes, those statutes encompassing hundreds and hundreds of sections of statute law.8 While not all of that law affected records, that volume of law to review for what does apply to records is intimidating even to the seasoned legal researcher.

Over an 8-year span, the laws affecting the Ontario non-profit have likely changed.

This then requires a review of the records retention schedule. An organization that relies on law that has been repealed and is no longer in force is not following the law. TR 27-2015 recommends periodic review of the retention schedule to ensure it is current and complies with laws in force. Many organizations complete an annual review of the law that applies to their records to ensure they are not liable for non-compliance.

The second problem is the engagement of the principle of state sovereignty. State sovereignty says that countries have the exclusive right to make laws within their own boundaries. Recently, in 2014, the Supreme Court of Canada explained that:

Sovereignty guarantees a state’s ability to exercise authority over persons and events within its territory without undue external interference. Equality, in international law, is the recognition that no one state is above another in the international order.9

An integral part of Canada’s state sovereignty is that Canada has federal, provincial, territorial and municipal governments. Each government has its own exclusive jurisdiction as defined in the Constitution Act, 1867.10 The federal or Canadian government has responsibility for laws of national and international concern, like a national currency, defence, foreign trade, criminal law and citizenship. The provinces are responsible for local concerns like education, health, property and civil rights and municipal government.

Historically, jurisdictions across Canada have developed and revised their own records retention laws. These jurisdictions include the government of Canada, 10 provinces, 3 territories, First Nations and the thousands of local governments, big and small, across the country. These laws vary greatly. It is useful to briefly review some of the laws each of these jurisdictions can enact in order to understand how state sovereignty works, and to see how greatly these laws differ in both number and complexity.

 

Canada

Here is an example of time-specified retentions by the Canadian government for Health Canada. Health Canada has a mandate under the Food and Drugs Act to help Canadians maintain and improve their health. Requiring legal records retention periods ensures relevant records are retained and available for inspection and testing, and that this government mandate is met. Specifically, the Blood Regulations under the Food and Drugs Act are intended to “promote the protection of the safety of Canadian blood donors and recipients in connection with the safety of blood for transfusion or for further manufacture into a drug for human use”.11 Sections 119 to 122 of the Blood Regulations set out, for different types of blood products, a complex matrix of required retention periods that vary from 1 year to 50 years.12

 

Provinces

Here is an example of a Canadian province with a records retention law for archival purposes. In Ontario, Part III of the Archives and Recordkeeping Act, 2006 requires every public body to prepare a records schedule for each class of public records it creates or receives, specifying the retention period and the disposition of the records at the end of that period.13 Each records schedule must then be approved by the Archivist of Ontario. Unlike Canada’s Blood Regulations, Ontario’s Archives and Recordkeeping Act, 2006 does not set out specific retention periods for specific government records. Instead, the Act focuses not on records management requirements but on the archival value of government records, to ensure their long-term preservation.

 

Territories

Here is an example of a Canadian territory specifying particular records that must be retained. Nunavut’s Wildlife Act has neither as complex a records retention requirement as Canada’s Blood Regulations nor an archival approach like Ontario’s Archives and Recordkeeping Act, 2006. Instead, the Nunavut government has taken a broad approach to records retention for wildlife matters in the territory. Section 187(4) provides that any record or document required to be kept under the Wildlife Act must be retained for a period of at least 2 years.14 These are not just government records, but all records that come under the purview of the Wildlife Act.

 

First Nations

Here is an example of a First Nations self-government in the early stages of records management development. The Kwanlin Dün First Nation is located in Whitehorse, Yukon. It has recently signed the Kwanlin Dün Final Agreement with the government of Canada and the Yukon Territory.15 The Final Agreement is comprehensive, with provisions defining types of records as “Documentary Heritage Resources” and requiring specific classes of records, like enrolment records, to be maintained. The Final Agreement is silent on records retention, but the Kwanlin Dün First Nation’s Corporate Services Department is responsible for effective records management of the First Nation’s records, including retention and disposition.16 As First Nations like Kwanlin Dün develop their legislative mandates, it is expected that they will enact records retention laws in the future.

 

Local Governments

Here is an example of a city creating a bylaw to address records management. The City of Surrey is British Columbia’s second largest municipality after Vancouver. Under the provincial Community Charter, the City Clerk is responsible for the preparation, maintenance, access and safe preservation of all City records.17 As well, the Freedom of Information and Protection of Privacy Act requires that the City make every reasonable effort to assist applicants and protect personal information by making reasonable security arrangements against such risks as unauthorized access, collection, use, disclosure or disposal.18 To meet these legal requirements under two different provincial statutes, the City of Surrey has enacted a records management bylaw.19 This bylaw authorizes the Surrey City Clerk to manage the records management system for the City. A policies and procedures manual is required, and this manual must include specified provisions, including those for records retention.

Let’s review the implications for the Ontario non-profit’s records retention plan. The significance of this discussion is that Canadian law does not apply inside the United States, and United States law does not apply inside Canada.

Applied to our example, the Smithsonian “Records Retention And Disposition Guidelines” apply only to US non-profits, not to Canadian non-profits. State sovereignty holds that Canadian or Ontario courts would not be bound in law to follow the American law on which the Smithsonian retention periods are based. That means the Ontario non-profit is not able to prove it complied with applicable Canadian or Ontario law.

That leads us to our third and most concerning liability. Since the Ontario non-profit is relying on outdated law, and indeed the wrong law from a foreign jurisdiction, it runs a great risk of penalties as a result. For example, the Smithsonian Guidelines provide a 7-year retention period for donations, after which the records can be destroyed by the Ontario non-profit.

But the Income Tax Act of Canada requires that if the donor of property to the Ontario non-profit directs that the donation be held for 10 years or more, then the donation records must be retained until 2 years after the Ontario non-profit winds up its operations and/or its charitable status is revoked by the CRA.20 Wind-up and revocation might not occur for decades, if at all. If the Ontario non-profit destroys the records evidencing this 10+ year donation, say 7 years after receiving it as the American Guidelines allow, it violates the Income Tax Act, because it needs to retain these donation records for 2 years after it ceases operation and has its charitable status revoked. Since its charitable status has not been revoked, the Ontario non-profit has wrongfully destroyed these donation records.

The penalties in the Income Tax Act of Canada that can be applied to the Ontario non-profit for this wrongful destruction are serious and multiple.

First, at any time, the CRA can inspect and audit all of the Ontario non-profit’s records, not just the donation records.21 Such an audit could alert the CRA to the destruction of the 10+ year donation records, and to other irregularities, increasing the Ontario non-profit’s legal liability.

Second, if the CRA finds out that the Ontario non-profit has improperly destroyed these records, the CRA can specify how and when the Ontario non-profit keeps its records.

The CRA can further require the Ontario non-profit to enter into an agreement with the CRA, and can then conduct follow-up visits to the non-profit’s offices to ensure it is complying with that agreement.22 Time and money spent dealing with the CRA over the illegal records destruction is time not spent doing the good works and charity for which the Ontario non-profit was created.

Third, if the CRA finds that the Ontario non-profit has unlawfully destroyed these records, it can also fine the non-profit, from a minimum of $1,000 to a maximum of $25,000.23 The CRA can go further and impose both the fine and up to 1 year in prison on the person at the Ontario non-profit who was guilty of destroying the records.24

Fourth, if the CRA deems the ill-advised records destruction to be tax evasion, it can launch a tax evasion prosecution. The penalties for tax evasion are a maximum fine of 200% of the tax owed and/or 2 years in prison.25 Tax evasion penalties are in addition to the penalties for records destruction.

Fifth, the CRA has the power to seek a compliance order from the Federal Court of Canada requiring the Ontario non-profit to abide by any agreements the CRA requires. Failure to comply with a Federal Court order can result in fines for contempt of court and even imprisonment.

If faced with the CRA’s three-pronged enforcement actions of audit, investigation and prosecution, the Ontario non-profit will likely retain legal counsel to represent it. That legal advice comes with a hefty price tag, not to mention the time the non-profit needs to spend informing and instructing its counsel, marshalling its evidence and preparing to defend itself against the CRA in Federal Court.

These are just the legal liabilities. The Ontario non-profit will also have to face the wrath of the donor and the donor’s family, the public and the media. This single unauthorized records destruction event could cost the Ontario non-profit future donations that would seriously hamper its operations.

 

MYTH 4: ALL EMPLOYEE PAY RECORDS MUST BE RETAINED INDEFINITELY

There is no legal requirement in Canada for organizations to retain all employee records indefinitely. This applies to other business records as well. Canadian courts have held that such a requirement is unreasonable.26

Also, indefinite retention without disposition violates the life cycle of records. In the final phase, records creators need to determine disposition: they must review the records and decide either to retain them permanently in an archive or to destroy them.

In a perfect world, at the moment of record creation, the record creator should consider the disposition of the record and appraise it: either to be disposed of after legal and operational requirements are completed (with a defined retention period) or preserved permanently as authentic evidence and heritage of the organization’s activities.

There are principles that records managers in Canada can use to effectively apply records retention. For example, ARMA International has the Generally Accepted Recordkeeping Principles® (the Principles).27 The Principles set out “the critical hallmarks of information governance and provide both a standard of conduct for governing information and metrics by which to judge that conduct.”28

There are eight Principles: Principle of Accountability, Principle of Integrity, Principle of Protection, Principle of Compliance, Principle of Availability, Principle of Retention, Principle of Disposition and the Principle of Transparency.

For the Principle of Retention:

[A]n organization shall maintain its records and information for an appropriate time, taking into account its legal, regulatory, fiscal, operational, and historical requirements.29

The Principle of Retention is silent on what the specific legal, regulatory, fiscal, operational, and historical requirements are. These requirements can cover a vast range, depending on the legal jurisdiction(s) in which the organization does business and the complexity of the recordkeeping system it employs. Complete legal research is required to ensure that the organization has the current, relevant records retention law at hand.

ARMA International’s Principle of Retention is closely allied to the Principle of Disposition.

The Principle of Disposition provides that:

[A]n organization shall provide secure and appropriate disposition for records and information that are no longer required to be maintained by applicable laws and the organization’s policies.30

Employee pay records typically include: employee personal information such as name and address, hours worked, wage rate, the calculation of wages and any overtime paid, and deductions and allowances.

Statutes typically contain the legal retention periods for these records series, and statutes differ from jurisdiction to jurisdiction across Canada. For example, some jurisdictions require these records to include not only payroll information but also information about leaves (such as compassionate leave, reservist leave, and maternity and parental leave) and vacations.

Let’s examine how the 10 provinces from coast to coast in Canada treat records retention for employee pay records.

 

British Columbia

British Columbia requires employers to retain employee pay records for 2 years after employment terminates.31 British Columbia does not require employers to retain records of leaves.

 

Alberta

Alberta is different from British Columbia. Alberta requires employers to retain employment records for at least 3 years from the date each record is made.32 The employment record includes leaves and vacations.

 

Saskatchewan

Saskatchewan requires employers to retain employee pay records for the most recent 5 years of the employee’s employment and for 2 years after employment ends.33 Vacation records are included in the pay records requirement, but not records of leaves.

 

Manitoba

Manitoba is the same as Alberta, 3 years.34 Both Alberta and Manitoba require retention of payroll information and records of leaves and vacations.

 

Ontario

Ontario has the most complex legal retention requirements for employee pay records in Canada.35 Ontario requires the employer either to retain the following records or to arrange and pay for someone else to retain them.

Records of the employee’s name, address and the date employment began must be retained for 3 years after employment terminates.

Records of the employee’s date of birth, if the employee is a student under 18 years of age, must be retained until the earlier of 3 years after the employee’s 18th birthday or 3 years after employment terminates.

Records of the number of hours the employee worked in each day and each week and overtime must be retained for 3 years after the day or week to which the information relates.

Records of employee wage statements, wages due on termination of employment and vacation must be retained for 3 years after the information was given to the employee.

Like Alberta and Manitoba, Ontario requires that records of leaves and vacation must be retained for 3 years.

 

Quebec

Quebec requires employers to retain payroll information for a given year for a 3-year period.36 Records of vacation, but not leaves, are included in payroll records.

 

New Brunswick

New Brunswick, like Ontario, permits employers or someone retained by the employer to keep payroll records. Payroll records must be retained for at least 3 years after work is performed.37 Records of leaves and vacation are included in payroll records.

 

Nova Scotia

Nova Scotia requires employers to retain payroll information and records of leave and vacation for at least 3 years after the work was performed.38

 

Prince Edward Island

Prince Edward Island requires employers to retain payroll records for 3 years after the work was performed.39 Vacation records are included, but not leave records.

 

Newfoundland and Labrador

Newfoundland and Labrador requires employers to retain employee payroll and vacation information for 4 years from the date of the last entry in the record respecting the employee.40

 

Triggering Event

The triggering event to start the retention period running depends on the province. There is no harmonized triggering event across Canada.

For British Columbia, the triggering event is the employment termination date; for Alberta and Manitoba it is when the record was made; for Ontario and Saskatchewan it is either termination date or when the record was made depending on the type of record; for New Brunswick, Nova Scotia and Prince Edward Island it is neither termination date, nor when the record was made but when the work was performed; for Newfoundland and Labrador it is from the date of the last entry in the record respecting the employee. For Quebec, it is the given year.

Records managers need to be mindful that the language of the statute determines the event that triggers the running of the retention period.
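The provincial rules above could be sketched as a lookup table in a retention system. The following is a hypothetical, simplified illustration, not legal advice: the dictionary and function names are my own, Ontario’s record-type-specific rules are collapsed to a single entry, and Saskatchewan’s two-part rule is reduced to its post-termination component:

```python
# Hypothetical sketch: each province's minimum retention period (years)
# and triggering event for employee pay records, as summarized above.
# Simplified: Ontario's trigger varies by record type, and Saskatchewan
# also requires the most recent 5 years of the employee's employment.
PAY_RECORD_RULES = {
    "BC": (2, "employment terminated"),
    "AB": (3, "record made"),
    "SK": (2, "employment terminated"),  # plus most recent 5 years
    "MB": (3, "record made"),
    "ON": (3, "varies by record type"),
    "QC": (3, "the given year"),
    "NB": (3, "work performed"),
    "NS": (3, "work performed"),
    "PE": (3, "work performed"),
    "NL": (4, "last entry in the record"),
}

def minimum_retention(province: str) -> str:
    """Render the minimum rule for a province; a reminder that both the
    period AND the trigger must be read from the statute itself."""
    years, trigger = PAY_RECORD_RULES[province]
    return f"at least {years} year(s) from: {trigger}"

print(minimum_retention("BC"))  # at least 2 year(s) from: employment terminated
```

The design point is that a retention period alone is not enough: the table must pair each period with its statutory triggering event, because two provinces with the same number of years can still require very different destruction dates.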

None of the provinces mandates a maximum retention period, and none requires destruction of employee pay records after the legal retention period has expired. The minimum legal retention periods range from 2 to 5 years depending on the province. The law is flexible in this regard: records managers can take advantage of this flexibility if it benefits their organizations to retain these records longer, the legal risk of doing so is reasonable, and the organizations need these records for business purposes.

 

MYTH 5: SINCE EMAILS ARE TRANSITORY, EMAILS ARE NOT RECORDS AND NOT PART OF A RECORDS RETENTION SCHEDULE

This myth has several elements: emails are transitory, emails are not records and thus emails do not form part of a records retention schedule. All these elements are myths.

An instructive case disproving this myth arises from a recent report by Elizabeth Denham, British Columbia’s Information and Privacy Commissioner (the “Commissioner”). The Commissioner provides independent oversight and enforcement of B.C.’s access and privacy laws, including the Freedom of Information and Protection of Privacy Act.

In October 2015, the Commissioner released a report: Access Denied: Record Retention And Disposal Practices Of The Government Of British Columbia.41

The facts of the case are unique.

In 2014, a former employee of the Ministry of Transportation and Infrastructure complained to the Commissioner that government emails responsive to an information access request about government meetings regarding provincial Highway 16, the Highway of Tears, had been improperly destroyed. The Highway of Tears is a 700 kilometre portion of Highway 16 between Prince George and Prince Rupert in northern British Columbia, along which a number of women have tragically disappeared.

The Commissioner’s office investigated. After the initial investigation, her office expanded the investigation to include the Ministry of Advanced Education and the Office of the Premier.

In her report, the Commissioner found that government employees had improperly destroyed emails and lied under oath, and that the Premier’s Office managed requests for records without documentation and failed to process those requests in a timely manner. As a result, the Commissioner found that the government had violated its legal duty to assist an information access request under the Freedom of Information and Protection of Privacy Act.

The Commissioner observed that:

I am deeply disappointed by the practices our investigation uncovered. I would have expected that staff in ministers’ offices and in the Office of the Premier would have a better understanding of records management and their obligation to file, retain and provide relevant records when an access request is received.42

The Commissioner’s observations show how deep this myth runs, the myth that email is not a record. British Columbia has had the Freedom of Information and Protection of Privacy Act in force since 1993. That over 20 years could elapse and those in government still have little understanding of records retention is both disappointing and surprising.

But this false belief does not reside only in British Columbia. In my experience, belief in this myth is common across Canada.

The Commissioner recommended the BC government take corrective action, including: mandatory training in records management, including training on what is a transitory record and what is not, to ensure that employees follow correct processes when responding to access to information requests.43

After the Commissioner released her report, the BC government retained its own expert to provide recommendations on how to implement the Commissioner’s report. This expert was David Loukidelis, QC, the former BC Information and Privacy Commissioner.

In December 2015, the government’s expert released his report (“Loukidelis Report”). 44

The Loukidelis Report frames the issue of emails and their relationship to RIM:

The records management and archival implications of modern electronic communications media are indeed daunting. It is difficult to understate the challenges such phenomena present for records and information management, and archives, in the electronic age.

The situation in British Columbia illustrates this. The provincial government’s Office of the Chief Information Officer (OCIO) has advised that some 284,000,000 emails are received by the provincial public service each year, with approximately 86,000,000 being sent each year. The storage space for received emails alone amounts to some 43 terabytes of data annually, with roughly 13 terabytes being required to store sent emails. This is apart from the doubtless staggering volume of other electronic information and records created each year. This matters, obviously, because, if records cannot be found because they have not been properly managed and retained in electronic form, important public interest objectives will be harmed. So will the public’s rights of access to records. [footnotes omitted]45

Many of the recommendations in the Loukidelis Report agree with those in the Commissioner’s report. The BC government has stated it will implement all of the recommendations in the Loukidelis Report.46

Reading both reports from these two privacy experts, one a sitting Privacy Commissioner trained as an archivist and the other a former Privacy Commissioner trained as a lawyer, affords insight into the question “what is a transitory record?”

If we pause and look outside the British Columbia Commissioner’s report and the Loukidelis Report, for a definition, we don’t have to go far.

In the Society of American Archivists’ A Glossary of Archival and Records Terminology, a “transitory record” is “a record that has little or no documentary or evidential value and that need not be set aside for future use.”47

In the Canadian General Standards Board, Electronic Records as Documentary Evidence standard, a “transitory record” is a “record that is required only for a limited time to ensure the completion of a routine action or the preparation of a subsequent record.”48

In Newfoundland and Labrador’s Management of Information Act, a “transitory record” “means a government record of temporary usefulness in any format or medium having no ongoing value beyond an immediate and minor transaction or the preparation of a subsequent record.”49

The common element in these definitions is that transitory records are only of temporary value. These definitions focus on the content of the transitory record; only the Management of Information Act mentions form.

Examples of common transitory records include: copies, drafts, notices, informational materials, advertisements like “junk mail” and unsolicited records.

In the RIM profession, transitory records are routinely destroyed to reduce record volume and the cost and time associated with managing records with temporary and limited value.

Returning to the British Columbia Commissioner’s report and the Loukidelis Report on the issue of email and transitory records: in her report, the Commissioner found that:

In conducting this investigation, it has become clear that many employees falsely assume that emails are impermanent and transitory, and therefore of little value. What this investigation makes clear is that it is a record’s content and context that determines whether a record is transitory, rather than its form.50

But, as a general proposition, is all email transitory? Both the Commissioner’s report and the Loukidelis Report conclude that not all email is transitory: whether an email is transitory depends on its content and context.

The definition of “record” in the Freedom of Information and Protection of Privacy Act supports this emphasis on the content and context of email, not its format:

“record” includes books, documents, maps, drawings, photographs, letters, vouchers, papers and any other thing on which information is recorded or stored by graphic, electronic, mechanical or other means, but does not include a computer program or any other mechanism that produces records;51

As the Loukidelis Report observes, consistent with the Commissioner’s conclusions:

[T]his non-exhaustive definition requires that information be “recorded or stored” by some “means”. It is beyond debate that electronic records, including emails existing only in electronic form, are records. Information in an email or an email string is electronically recorded or stored and is thus a record.52

This is beyond debate not only in the privacy world but in the justice system in general. For example, all of the Evidence Act statutes in Canada, whether federal, provincial or territorial, employ similar non-exhaustive definitions of a “record”. In doing so, these statutes permit an email to be admitted in evidence as a record in its own right in legal proceedings across the country. They also permit email to stand on its own, in place of paper records.

As a result, email in Canada can be a record, and like paper records, email has a proper home in a records retention schedule.

 

Myth 6: There Are No Legal Consequences For Destroying Records, With Or Without A Records Retention Schedule

A final myth is that there are no legal consequences for destroying records with or without a records retention schedule. The basis of this myth, in my experience, is the bogus belief that records are “just filing”. Records are not seen as an organization’s key information assets, which they are. Records are not considered evidence of an organization’s legal and business transactions, which they are.

In truth, there are serious legal consequences for destroying records, with or without a records retention schedule.

The term lawyers and judges use for the illegal destruction of records is “spoliation”. The current statement of the law of spoliation in Canada is found in the 2008 Alberta Court of Appeal case, McDougall v. Black & Decker Canada Inc.53

In that case, the litigation between the parties focused on a house fire in the home owned by the McDougall family. The fire department determined that the fire was caused by either carelessly disposed smoking materials or a malfunctioning cordless electric drill manufactured and distributed by Black & Decker. The McDougalls sued Black & Decker for the loss of their home. By the time litigation was started, the McDougalls’ house had been replaced by a new home. Parts of the McDougalls’ drill in the possession of their insurance company’s investigator went missing and were never found.

Black & Decker applied to have the lawsuit dismissed because of spoliation. It claimed it was unable to defend itself in court to show how the fire really began. It could not investigate the fire scene because it was now a new home. It could not investigate the drill because it was missing. The Alberta Court of Appeal found there was insufficient evidence to prove spoliation but directed a new trial. At the new trial, it gave Black & Decker the right to examine the insurance company’s investigator who had examined the drill before the evidence went missing.

In coming to its decision, the Alberta Court of Appeal reviewed the law of spoliation in Canada. The Alberta Court of Appeal found that spoliation “occurs where a party has intentionally destroyed evidence relevant to ongoing or contemplated litigation in circumstances where a reasonable inference can be drawn that the evidence was destroyed to affect the litigation”.54 It also found that, if spoliation occurs, the principal remedy is a presumption of fact that the destroyed evidence would not assist the spoliator, and that the courts, under their Rules of Court, have a variety of remedies available to deny the spoliator any advantage.55

The Rules of Court across Canada give the courts broad discretion to apply a number of remedies to deny the spoliator the fruits of his or her evidence destruction.

Courts may refuse to admit documents into evidence. In addition, courts have the power to detain, take custody of or preserve evidence. Courts may draw an adverse inference against a party guilty of spoliation, finding certain facts against that party. Courts may refuse to hear witnesses or permit a spoliator to examine or cross-examine witnesses. Courts may impose costs against a party who engages in spoliation. Courts may also levy contempt of proceedings orders against spoliators. In addition to these court sanctions, organizations may face further court proceedings for fraud or other criminal conduct arising out of the spoliation. Most rules of court permit a default judgment, without a trial, to be entered against a defendant who destroys evidence. Similarly, most rules of court permit the dismissal of a legal action when a plaintiff commits spoliation.

In addition to these serious legal consequences, an organization may also suffer losses from bad publicity, loss of business or loss of reputation if it is found guilty of spoliation. As well, there is the time and money an organization must expend in order to defend court actions for claims it spoliated. This is time and money not spent on furthering the organization’s mission or purposes.

Given the wide range of sanctions, as noted above, there are serious legal consequences for destroying records, with or without a records retention schedule. Compliance with records retention law is a “shall”, not an option.

 

CONCLUSION

After you read this article, I hope you are convinced that these myths, if they live in your organization, need to be dispelled. I hope my article gets you to ask yourself questions about the efficiency and effectiveness of your RIM program. Questions like:

  • Who in my organization should read this article?
  • Is my organization’s retention schedule legally compliant?
  • How do I conduct legal research so I know what legislation has changed that affects my organization’s records?
  • Do I need to conduct a legal review of my retention schedule? How do I do that?
  • When was the last annual review of my retention schedule completed?
  • Does my retention schedule include email?
  • Does my organization need a RIM bylaw? How do I make a bylaw?
  • Does my organization need RIM policies and procedures? How do I write them?
  • Does my organization need more RIM training? On records retention? On transitory records?
  • Is my organization retaining too many records?
  • Is my organization doing disposition? Doing disposition properly?
  • Is my organization at real risk of audit, investigation or prosecution and fines or prison for noncompliance with the law?
  • Where do I get a RIM lawyer?

Once you ask yourself these questions and think through your answers, I hope you start a conversation within your organization about RIM so you can help your organization get on track to using your records more as information assets with improved legal compliance, risk control and greater peace of mind.

 

Works Cited

1Page 15611, Hansard (June 15, 1995)(Volume 21, Number 5)(https://www.leg.bc.ca/documents-data/debate-transcripts/35th-parliament/4th-session/h0615pm2#15624).
2See http://laws.justice.gc.ca/PDF/I-3.3.pdf.
3See Bookstore at http://arma.org/.
4Page 4.
5See http://siarchives.si.edu/cerp/RECORDS_RETENTION_SCHEDULE_rev3.pdf.
6Page 5.
7See http://bit.ly/1KPn532.
8See http://www.qp.gov.bc.ca/statreg/bulletin/bull2015/cumulati.htm.
9Kazemi Estate v. Islamic Republic of Iran, [2014] 3 SCR 176, 2014 SCC 62 (CanLII) at para. 35 per LeBel J. (http://canlii.ca/t/gdwht).
10See http://www.justice.gc.ca/eng/csj-sjc/just/05.html.
11Ministry of Health, Guidance Document: Blood Regulations (Effective date October 23, 2014)(http://www.hc-sc.gc.ca/dhp-mps/alt_formats/pdf/brgtherap/applic-demande/guides/blood-reg-sang/blood-guid-sang-ligne-2014-10-23-eng.pdf).
12See Blood Regulations (SOR/2013-178)(http://laws.justice.gc.ca/eng/regulations/SOR-2013-178/).
13See sections 11 to 16 (http://canlii.ca/t/l33t).
14See http://canlii.ca/t/51x1n.
15See http://www.kwanlindun.com/uploads/Final_Agreement.pdf.
16See page 4 of the Kwanlin Dün First Nation Annual Report 2014-2015 (http://www.kwanlindun.com/uploads/KDFN_Annual_Report_2014_15_WEB.FNL_.pdf).
17See section 148 (http://www.bclaws.ca/civix/document/id/complete/statreg/03026_00).
18See sections 6(1) and 30 (http://www.bclaws.ca/civix/document/id/complete/statreg/96165_00).
19See Surrey Corporate Records By-law, 2010, No. 17002 (www.surrey.ca/bylawsandcouncillibrary/BYL-17002-1D94.pdf).
20See section 5800(1)(d) of the Income Tax Regulations and Canada Revenue Agency, Books and Records Retention/Destruction (IC78-10R5)(June 2010)(http://www.cra-arc.gc.ca/E/pub/tp/ic78-10r5/README.html).
21Section 231.1.
22Section 230(3) and IC78-10R5)(June 2010) at page 4.
23Section 238.
24Ibid.
25Section 23.
26See Moutsios c. Bank of Nova Scotia, 2011 QCCS 496 (CanLII)(http://canlii.ca/t/2fpvn).
27See http://arma.org/r2/generally-accepted-br-recordkeeping-principles.
28Page 2 in ARMA International’s Information Governance Maturity Model (http://arma.org/docs/bookstore/theprinciplesmaturitymodel.pdf?sfvrsn=2).
29Principles.
30Ibid.
31Section 28(2) of the Employment Standards Act, R.S.B.C. 1996, c. 113 (http://www.bclaws.ca/civix/document/id/complete/statreg/96113_01).
32Section 15 of the Employment Standards Code, R.S.A. 2000, c. E-9 (http://canlii.ca/t/52bwr).
33Section 2-38 of the Saskatchewan Employment Act, SS 2013, c S-15.1 (http://canlii.ca/t/52kp9).
34Section 135(3) of the Employment Standards Code, C.C.S.M. c. E110 (http://canlii.ca/t/52ktp).
35Sections 15, 15.1 and 16 of the Employment Standards Act, 2000, S.O. 2000, c. 41 (http://canlii.ca/t/52k2z).
36Section 2 of the Regulation respecting a registration system or the keeping of a register, C.Q.L.R. c. N-1.1, r. 6 (http://canlii.ca/t/hnx0).
37Section 60 of the Employment Standards Act, S.N.B. 1982, c. E-7.2 (http://canlii.ca/t/52cnh).
38Section 15 of the Labour Standards Code, R.S.N.S. 1989, c. 246 (http://canlii.ca/t/524c4).
39Section 5.6 of the Employment Standards Act, R.S.P.E.I. 1988, c. E-6.2 (http://canlii.ca/t/52k23).
40Section 63 of the Labour Standards Act, R.S.N.L. 1990, c L-2 (http://canlii.ca/t/526fq).
41Investigation Report F15-03 (October 22, 2015)(https://www.oipc.bc.ca/investigation-reports/1874).
42Supra at page 3.
43Supra at page 57.
44David Loukidelis, QC, Implementing Investigation Report F15-03 Recommendations To The Government Of British Columbia (December 2015)(www.cio.gov.bc.ca/local/cio/d_loukidelis_report.pdf).
45Supra at pages 6 and 7.
46Office of the Premier, “Premier’s statement on freedom of information and records management improvements”(December 16, 2015)(http://bit.ly/1RDU1gv).
47See http://www2.archivists.org/glossary.
48CAN/CGSB-72.34, page 13 (http://www.techstreet.com).
49Section 2(h) in S.N.L. 2005, c. M-1.01 (http://canlii.ca/t/k03h).
50Supra at page 49.
51Schedule (http://www.bclaws.ca/civix/document/id/complete/statreg/96165_00).
52Loukidelis Report at page 10.
532008 ABCA 353 (CanLII)(http://canlii.ca/t/21bl9).
54Supra, at para. 18.
55Supra, at para. 29.

The History of Records Management in Canada, 1867 – 1967

By Uta Fox

This paper surveys the development of records management in Canada, focusing primarily on the federal government as well as a number of provinces. In implementing records management, the Canadian government developed a unique approach. Ian Wilson, Librarian and Archivist of Canada,1 commented that government (federal, provincial, and municipal) archives “… preserve not just the official administrative records but also acquire private materials in all documentary media bearing on history … and combine the traditional role of a record office with that of an active cultural agency…” (p.16). In other words, Canadian governments have adopted an integrated archival records management approach to the management of government archives and records.

According to Mark Langemo, records management originated in the US federal government during the late 1940s, evolving from the US archival profession. The US National Archives was established in 1934 to handle the past accumulation of federal documentation and the increasing volumes of records generated by the US federal government. In response to this growth, President Harry S. Truman established the Commission on the Organization of the Executive Branch of the Government, which became known as the Hoover Commission, in the late 1940s. A Task Force on Paperwork Management was established and Emmett J. Leahy was selected as chairman. The Hoover Commission defined the term “records management” in the late 1940s (Langemo, p.2-3). It is important to note that this is the first time the term was used; prior to this, what we know as records management was referred to as “paperwork.”

In Canada the federal government’s records management programmes also derived from archives. To understand the origins and development of records management requires an understanding of the development of the Canadian archival tradition. And in Canada the responsibility for collecting its historical records, both public and private, fell to the government. (Wilson, p.15) In other words, collecting Canadian history and the records that documented that history was viewed as a public responsibility.

The first efforts to acquire historical or archival records originated from the Literary and Historical Society of Quebec, established in 1824. Following Confederation a more formal arrangement for Canadian historical records was desired and Cabinet, in 1872, created an “Archives Branch” in the Department of Agriculture, which was the department responsible for arts and statistics (Wilson, p.22). Journalist Douglas Brymner was appointed the first archivist and given a budget of $4000, three empty rooms, and very vague instructions (Atherton, p.86). Emphasizing collecting and copying,2 Brymner focused on acquiring pre-Confederation records. He was not concerned with the preservation of current government records (Millar, p.108).

According to Jay Atherton, the Post Office, in 1889, became the first department to experience a “records management” problem. The Postmaster General requested from Cabinet a standard five-year retention period for routine financial records. Looking to Britain for advice on “the weeding of public documents” resulted in the Post Office’s schedule being amended so that more valuable documents were retained longer than those with less value. As well, Cabinet authorized the destruction of records mentioned in the amended schedule after their specified retention was achieved. As Atherton notes, Cabinet had approved the first records schedule in the Canadian government (Atherton, p.87). And while other departments destroyed records irregularly after consulting with the Treasury Board, the practice was neither consistent nor systematic.

Interestingly, while the Archives Branch was being created by Cabinet, the Department of the Secretary of State, which was responsible for “keeping all State records and papers not specially transferred to other Departments,” created its own Records Branch. The Records Branch was concerned with the government’s administrative records and “The safe keeping and classification of the archives” (Wilson, p.22). Henry J. Morgan became Keeper of the Public Records, which resulted in two agencies responsible for collecting historical public records. Yet little effort was expended on current government records. Government departments showed little enthusiasm for transferring records either to the Keeper or the Archives Branch, and the majority of government records still lay in attics or basements of government buildings on and around Parliament Hill (Atherton, p.87).

It fell to Joseph Pope, former secretary to Sir John A. Macdonald and Under Secretary of State in the Laurier government, to bring attention to the duplication, confusion and expense of two rival agencies, as well as the predicament of the historical public records. He proposed to consolidate the Archives and Records branches into a public record office (Wilson, p.23). A short time later, a fire in the West Block in 1897 destroyed an entire floor and its contents, underscoring Pope’s concerns and leading Cabinet to appoint a Commission to study the “periodical destruction of such papers and vouchers as may be deemed useless and which are merely encumbering the vaults…” (Atherton, p.88).

Following an inspection of the records in all the departments the Commission made a number of recommendations, one of which was the amalgamation of the Archives and Records Branches. Additionally, it suggested that:

  • The government build a fire-proof building known as the Records Office which would function as a repository for the archives
  • A standard ten year retention period be adopted for routine financial documents
  • The departments review filing systems to determine records of no value, which the Commission felt constituted the majority of the documentation
  • The departments should be allowed space in the Records Office to keep their “more recent records” and retrieve them when needed

Clearly we are beginning to see an awareness of some of the components of a records management programme. As Atherton notes, had the government implemented the Commission’s recommendations, it would have established a “rudimentary form of central control over records disposition” as well as a single Public Record Office for storage of both current and historical records (Atherton, p.91). However, even though the Commission’s findings were not acted upon, it did articulate and identify components of a records management programme and provide a reference document for future Commissions’ perusal.

While the pace towards a records management programme was slow, developments continued. An Order in Council, in 1903, amalgamated the positions of Dominion Archivist and Keeper of the Public Records; an Archives building was promised; all historical files (except those of the Privy Council Office) designated by the Commission were ordered transferred to the Archives “…for greater safety in their preservation…”; and Arthur G. Doughty was appointed Dominion Archivist and Keeper of the Public Records (Wilson, p.24). As Jay Atherton notes, the government approved the transfer of department records to the Archives to be preserved, arranged and made accessible for historians, in effect recognizing the historical and research value of these records.

For the government departments, though, transferring records to the Archives was voluntary. Furthermore, other records management principles not addressed were a mechanism for the immediate destruction and disposal of useless documents, a standard retention period for routine financial records, a review of filing systems in departments, and a fixed age for transfer of records to the Archives. Records continued to accumulate at the Archives, but it was through the Archives’ collection policies, not the transfer of government records, that the holdings grew.

In 1912, the Public Archives Act created the Archives as a separate department under the Secretary of State, and Arthur Doughty became the Dominion Archivist. Cabinet was now authorized to remove public records and historical material from the custody of the various government departments to the Archives building, which had been completed in 1906. However, in terms of records management criteria, the act “did not explicitly ensure the preservation of public records in offices of origin, nor did it provide for their orderly disposal under the supervision of Archives staff” (Atherton, p.92).

While the Archives continued collecting both public and private records, a policy for transferring records from the various government departments to the Archives did not exist. Nor was there a system to manage current government records. At Doughty’s urging, another Royal Commission was established in 1912 to study and explore the “state of the department records,” and while it too recommended establishing a public records office, comparable to a records centre, as part of the Archives, the First World War deferred this development (Wilson, p.32). An active Public Records programme eluded Doughty throughout his tenure as Dominion Archivist.

Records volumes increased dramatically during and following the Second World War. Additionally, huge growth in records activity, coupled with the ability to create records more rapidly through such technologies as the typewriter and microfilm, greatly expanded the government’s holdings. The Royal Commission on National Development in the Arts, Letters and Sciences, 1949-1951, known as the Massey Commission, had, according to Laura Millar, a “strongly nationalistic imperative” in that it wanted to change Canada’s cultural environment (p.114).

The Commission also addressed the state of federal records. It did applaud the government’s establishment of a Public Records Committee in 1945. This committee, under the auspices of the Secretary of State, had the Dominion Archivist as Vice Chairman. The purpose of the committee was to reduce the “vast paper burden” within government, where decades-old files were “moldering in damp cellars,” and to identify and transfer those government records having historical value to the Archives (Cook, p.206).

But, in addressing the Public Archives, the Commission discussed the public records problems at length. Referring to the past Royal Commissions (1897, 1912) that had studied the public records, the Commission noted that these past Commissions “labored almost if not altogether in vain” (Massey, p.113), since the majority of their recommendations were not acted upon by the government. It criticized the government, the civil service, and the Archives for the fact that government records were “in a state of chaos,” scattered around Ottawa in inactive or dead files (Cook, p.207). Furthermore, the Commission addressed the lack of economy in recordkeeping, pointing to the $175,000 spent annually to store records which “can probably be classified as dead, in that they have no further administrative or historical usefulness” (Massey, p.114).

Some of the Commission’s recommendations included a review and clarification of the current regulations governing the disposal of public documents; implementing provisions for systematic and continuous transfer of inactive records to the Archives; delegating records destruction authorization to the Public Records Committee; and hiring and training properly qualified records officers.

Fortunately, at this same time W. Kaye Lamb became the fourth Dominion Archivist of Canada (1948-1969). Under his tutelage, records management became firmly entrenched in his efforts to modernize the Public Archives. He firmly favoured the American practice of a records centre as “a half-way house, or cooling-off place” for records between active use in departments and their final disposition of destruction or archival transfer. In 1956 the Public Archives Records Centre (PARC) opened at Tunney’s Pasture, crown-owned land two miles west of Parliament Hill and adjacent to the Ottawa River. Departments began to transfer their records; an efficient system was established for accessioning, listing and storing inactive records; accurate statistics were kept; and a reference service was established which prided itself “… on delivering any file to the department requesting it within three hours of the receipt of the request” (Ormsby, p.40). However, the years of records neglect proved to be challenging. Some of the documents were in such bad shape that they literally were handled with shovels, while others were infested with silverfish (Cook, p.210). PARC soon had rooms for receiving, cleaning and sorting records, a fumigation chamber, offices, reference rooms and a research room (Ormsby, p.40).

In addition to providing valuable records storage, PARC was a control mechanism that ensured only scheduled records were transferred, and it functioned to identify those records whose value had ceased as well as those with historical value. The Glassco Commission, or the Royal Commission on Government Organization, 1960-63, also endorsed the Records Centre concept, urging it to be the focus of any records management programme. It suggested establishing regional Records Centres in areas where it was cost effective. Both Toronto and Montreal soon had federal Records Centres.

Soon after PARC opened, the Central Microfilm Unit was moved to the Public Archives; a Records Management Survey Committee conducted a survey of records management in the federal departments; in 1961 the first month-long records management course was held; and in 1966, the Public Records Order abolished the Public Records Committee and gave the Dominion Archivist sole authority over the disposal of federal public records and the responsibility for coordinating the government records management programme (Atherton, p.105).

Lamb saw to it that microfilm, with its many advantages, became part of the archival records management programme. Regional government offices opened across the country, and microfilm provided efficient access to duplicate copies of government records. Security was another consideration, particularly in the 1950s with the threat of nuclear war. In fact, in 1959, the Public Archives was “assigned core responsibility to operate a new ‘essential records’ program across the entire government” (Cook, p.212). Once processed, microfilm was stored in secret sites but available if “a major disaster, either natural or nuclear occurred” (Cook, p.212), and a Vital Records programme was in effect.

On the provincial scene, the development of records management in Ontario’s provincial government paralleled the federal government in that the records management impetus came from the Archives. Following World War II, the provincial records situation was uncoordinated, decentralized, and unmanaged.

With individual provincial departments controlling their recordkeeping, increasing volumes of inactive and dead records adversely affecting retrieval, and a microfilm programme lacking cohesion, the Ontario government failed to view records and recordkeeping as an important element in government operations.

Threatened with closure shortly after the Second World War, the Ontario Archives, under the direction of George Spragge, Archivist of Ontario (1950-1963), embarked on an “archival records management” strategy (Craig, p.10). Spragge drafted retention schedules, lobbied for uniform procedures for systematic records disposal, and suggested the concept of a Public Records Centre, but it was not until 1965, with the publication of the Moore Report, that an archival records management programme was established in that province.

While requests for records disposal were subject to the Archives Act, the Archives themselves wielded little authority. One of the problems in Ontario was the lack of influence of the Archives. While the Archives Act of 1923 gave the Archivist of Ontario status as a deputy head of department, the Archives themselves were viewed as “…peripheral to government, an antiquarian organization dealing with the past as a service to small groups of scholars” (Craig, p.4-5). Indeed, the Treasury Board’s Secretariat was pivotal to the government’s administration.

The Moore Report of 1965 recommended that records management be “treated as an integral and essential part of efficient administration and not as an end in itself” (Craig, p.17). It reiterated previous records management requests, one of which was the Records Centre. But in fulfilling that service the government had to resort to outsourcing. Lacking qualified staff, an agreement was made with Harold Moulds of H.M. Record Services who, in addition to providing the Records Centre and Records Centre services, also assisted in developing classifications for department records, trained staff, and developed standard procedures and retention scheduling. The first records retention schedule was developed in 1965 and departments began shipping inactive records to the Records Centre (Craig, p.20). Ontario had embraced the archival records management concept.

The movement towards records management in New Brunswick took a different course. As Marion Beyea notes, while “… [records management] was born … during the buoyant Sixties, [it was] late enough to benefit from the experience of the federal government and several provinces…” and it was not hampered by the archival perspective because at that time, New Brunswick did not have a provincial archives (p.61). New Brunswick’s Public Records Act of 1929 defined records and gave the province responsibility for their preservation but not in terms of a records management programme. In 1963 the province passed the Public Documents Disposal Act which established a Documents Committee. It was this Committee that requested a Provincial Archives and a records management programme.

Interestingly, Harold Moulds (of H.M. Record Services Ltd.) was called on to assist with the development of a records management programme, and he recommended the immediate hiring of a director of records. Fernando LeBlanc was hired to identify active and inactive records, transfer inactive records to a records centre, and destroy records not required (Beyea, p.63). Dominion Archivist W. Kaye Lamb was also consulted, and he suggested a moratorium on records destruction until such time as these records were appraised by a professional archivist. Lamb made additional recommendations, and Beyea credits these as laying the foundation for a Provincial Archives and a records management programme. For example, Lamb suggested that archives and records management should be “…jointly developed, that records are handled economically, and that items of long-term value are identified, segregated and preserved” (Beyea, p.66). He also advised the integration of archives and records management programmes, which occurred in 1967 (Beyea, p.68).

However, the government did not adopt Lamb’s recommendation of building a Records Centre near the Bonar Law-Bennett building. Instead it kept the Records Centre in the Department of Public Works’ Records building and the Douglas warehouse, which it quickly outgrew. In 1971 the Records Centre was moved to temporary quarters on McLeod Avenue, while the supplementary storage in the Douglas warehouse was moved to another warehouse, the Neil building in downtown Fredericton. Neither location promoted staff confidence in using the Records Centre. A lack of space and fire safety, along with the floods of 1972 and 1973, hampered the development of the records programme (Beyea, p.71). Eventually, in the late 1970s, a Records Centre was provided just outside of Fredericton, which encouraged the “quality and quantity of the records programme” to expand (Beyea, p.72).

By 1967 the federal government had made considerable advances in its records management programme. In 1966 a new Public Records Order extended control over records destruction to include all media; scheduling was made mandatory; departments required the Dominion Archivist’s authorization to destroy or remove records; and the Dominion Archivist’s advice was required for any microfilming projects. The Archives had complete authority over scheduling and records disposal. 1966 also saw the Public Archives Records Centre (PARC) become the Records Management Branch, responsible for the following areas: accessioning and reference services for inactive records; managing the regional Records Centres; and providing advisory services including scheduling and disposal, vital records, inventorying, training and publications.

 

Sources

Jay Atherton, “The Origins of the Public Archives Records Centre, 1897-1956,” Archivaria.

Marion Beyea, “Records Management: The New Brunswick Case,” Archivaria.

Terry Cook, “An Archival Revolution: W. Kaye Lamb and the Transformation of the Archival Profession,” Archivaria.

Barbara Craig, “Records Management and the Ontario Archives, 1950-1976,” Archivaria.

Dr. Mark Langemo, Winning Strategies for Successful Records Management Programmes (Information Requirements Clearinghouse: Denver, 2002).

Laura Millar, “Discharging our Debt: The Evolution of the Total Archives Concept in English Canada,” Archivaria.

Ian E. Wilson, “‘The Noble Dream’: The Origins of the Public Archives of Canada,” Archivaria.

http://collectionscanada.gc.ca/massey/h5-417-e.html, accessed May 3, 2008.

 

1Ian E. Wilson, Librarian and Archivist of Canada, was, in March 2008, elected as the President of the International Council of Archives, a position he will assume in July 2008.

2Laura Millar notes that the decision to copy records emerged from concern for the preservation of those records central to Canada’s history of exploration and settlement, records not necessarily found in Canada (p.106). Copying records from Britain, France, the former colonies, etc., in no way diminished their value, as a copied record was considered equal to an original record.