Bringing Light to the Dark Data Abyss: The Science of Data Discovery
By Bill Millican, Xact Data Discovery
Managing discovery can be a tricky business. Some companies get into trouble because they understand so little about it that they simply do not know where to begin. Others think they understand everything there is to know and attempt to manage it all themselves.
Understanding the data you have and where it resides is important for any company, but it is also important to know when to call in the experts. There is a scientific approach to collecting data for discovery, and the courts know this, which is why they prefer you do not attempt collection on your own.
The Sedona Conference® (TSC), founded in 1997 by Richard G. Braman, consists of an experienced team of litigating attorneys. The group gathers regularly in Sedona, Ariz. and other venues to continue its work on the growing issues and difficulties of discovering and producing organizational data stored electronically.
In 2003, TSC published its first “Observations,” and then its first “draft” principles, which eventually led to The Sedona Guidelines and Principles. The Principles were followed by—and to some extent, inspired—amended Federal Rule of Civil Procedure 26, which has in turn generated a bevy of new and relevant court mandates that assist and direct attorneys as they go about the discovery process in preparation for trial.
While it’s tempting to add “the rest is history,” that would be grossly inaccurate: these guidelines did not close the book on discovery struggles.
In fact, the struggles persist as data continually expands with the ongoing development of technology, and the growth rate is alarming. Some predictions hold that data will grow 19-fold by 2020; and as businesses deal more regularly in virtual settings, the task of keeping up with and effectively managing organizational data will become increasingly daunting as the data abyss deepens and darkens.
These challenges were exemplified in a search on a recent electronic discovery project that produced approximately 29,000,000 hits. The task was to determine which ones were most relevant to the company and for this particular eDiscovery initiative.
Deciding which of those hits should be reviewed was absurdly difficult, and possibly futile; yet organizations often attempt comparable feats, and the consequences of making poor selections can be devastating.
Caught Between Good Intentions, Risks
Committing to a strategy for electronic data management is not for the faint of heart; the choice is fraught with risk. Organizations are guided by a desire to do what is best for their clients or customers, to do what the courts may require, or simply to do what is right.
The most difficult challenge facing records and information managers and their colleagues regarding eDiscovery is that this task is only going to get more difficult with the explosion of electronically stored information (ESI).
Need for Expert Assistance
The courts have become incredibly savvy about the issues surrounding eDiscovery. They understand how difficult it is, but they are also aware of the advanced tools and expertise available to the parties that appear before them.
The courts expect organizations that lack the expertise to conduct reasonable eDiscovery on their own to identify and engage these experts and use these tools. There is no excuse not to enlist those who have spent years building specialized expertise in negotiating the hidden pitfalls that lurk within the dark data abyss.
Organizations resist adopting appropriate eDiscovery processes because of lingering perceptions that:
- It costs too much;
- It is too time consuming;
- It takes too many human resources;
- They lack the right kind of equipment;
- They lack the right kind of software; or
- The odds of getting caught are low enough to risk rolling the dice.
The courts have heard all of these excuses before; repeating them will not alter a court’s position in any meaningful way.
If organizations are willing to create, receive, or in some other fashion accumulate and amass data, then it is critical that they have a strategy for properly and ethically managing that data.
The question that all organizations should be asking is: What does our data management strategy look like, and how efficient and effective is it? After all, the courts hold that ESI and data belong to an organization, and the organization should bear the cost of knowing what data is where and who has governance over it.
A Common Scenario
One day, executive management makes the landmark decision to allocate appropriate resources to develop, deploy, and properly oversee a data management strategy.
There is no turning back, no wavering, no second guessing. The organization must now begin to find its way out of the abyss of data before the oxygen has been depleted, and there is not even the slightest glimmer of light.
It sounds dramatic, but at the start of a project of this magnitude, this is often how everyone, from executives to staffers, feels.
The initiative and all the related decisions loom large. The project involves senior management from every corner of the organization: information technology, risk management, business continuity, records and information governance, finance, human resources, revenue generation, and facilities. After hours and days of research, discussion, and sometimes quite heated debate, two options emerge:
One—Build a solution using company resources; or
Two—Look for external help.
The question then becomes, “How to choose?”
It should be a no-brainer, but often it is not. Many organizations simply put the cart before the horse by choosing the all-too-common approach of collecting data before a data management strategy has been developed.
Eagerness is a factor, but this decision is frequently driven by ignorance. Data collected without proper planning is often meaningless, and collecting it just creates more headaches; data must be properly identified, organized, and filtered before it can be effectively collected.
The scientific method is a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge.
According to Wikipedia, to be termed scientific, a method of inquiry must be based on empirical and measurable evidence subject to specific principles of reasoning. That description applies equally well to data management and collection: a scientific approach should govern how data is discovered, located, identified, collected, and classified.
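To make the idea concrete, the five stages named above can be pictured as an ordered pipeline that each piece of ESI moves through in sequence. The sketch below is purely illustrative; the stage names, the `Item` record, and the `advance` method are assumptions for this example, not features of any standard eDiscovery tool.

```python
from dataclasses import dataclass

# Ordered stages of a hypothetical discovery pipeline, mirroring the
# sequence in the text: discover -> locate -> identify -> collect -> classify.
STAGES = ["discovered", "located", "identified", "collected", "classified"]


@dataclass
class Item:
    """One piece of ESI moving through the pipeline (illustrative only)."""
    name: str
    stage: str = "discovered"

    def advance(self) -> str:
        """Move the item to the next stage; refuse to skip or loop."""
        i = STAGES.index(self.stage)
        if i == len(STAGES) - 1:
            raise ValueError(f"{self.name} is already classified")
        self.stage = STAGES[i + 1]
        return self.stage


doc = Item("quarterly_report.msg")
while doc.stage != "classified":
    doc.advance()
print(doc.stage)  # -> classified
```

The point of the structure is discipline: an item cannot be collected before it has been identified, which is exactly the cart-before-the-horse mistake described earlier.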
How can a corporate culture that has been reactionary rather than proactive regarding data be changed? Viewing data discovery as a science is an important first step.
A comprehensive data management policy can address many of the rigid and unyielding factors that complicate the process:
- Time. Opportunity’s knock grows fainter the longer action is deferred.
- Methodology. Happenstance and good fortune are always welcome, but they cannot be counted on in this process; a defined methodology can.
- Technique. Effective, efficient, and persistent data discovery is strategic; it is not a seat-of-the-pants exercise.
- Flexibility. Managing ever-changing data demands an ongoing commitment to improvement; data is never dormant, and its growth is unpredictable.
- Audit and Assessment. Daily routines must be constantly evaluated for accuracy.
- Measurement. Good data management requires solid metrics: how much data exists, where it resides, and when and by whom it was created; where it came from and where it went; and who authorized, used, took, shipped, or destroyed it.
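The measurement questions in the last bullet can be captured in a single audit record per item. The record below is a minimal, hypothetical sketch; every field name (`path`, `custodian`, `disposition`, and so on) is an assumption chosen to mirror the questions in the text, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


# A minimal, hypothetical audit record answering the measurement questions
# in the text: how much, where, when, by whom; provenance and disposition.
@dataclass(frozen=True)
class AuditRecord:
    path: str             # where the data resides
    size_bytes: int       # how much
    created_at: datetime  # when it was created
    created_by: str       # who created it
    source: str           # where it came from
    custodian: str        # who has governance over it
    disposition: str      # e.g. "retained", "shipped", "destroyed"


rec = AuditRecord(
    path="/shares/finance/q3.xlsx",
    size_bytes=48_213,
    created_at=datetime(2013, 4, 1, tzinfo=timezone.utc),
    created_by="jdoe",
    source="email attachment",
    custodian="records-management",
    disposition="retained",
)
print(rec.disposition)  # -> retained
```

Making the record immutable (`frozen=True`) reflects the audit-and-assessment bullet above: once a measurement is taken, it should be appended to, not rewritten.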
When organizations get serious about managing the risk posed by the massive amount of data involved in doing business, they will set about defining and implementing a comprehensive data management policy. This policy will include the proper solutions for data discovery and, in most cases, a partnership with a provider of data discovery expertise.
Gloom and doom have traditionally dominated discussions about eDiscovery, and that is not unwarranted. Tackling an eDiscovery project can truly feel like staring into an abyss given the unfathomable amounts of data being created in today’s world. But, never fear: making the decision to take a proactive and scientific approach to data is the first step to bringing light to that abyss.
Bill Millican is a veteran and respected expert in the field of information and records management. He has nearly 40 years of experience in various hands-on roles and consulting positions, including having served as director of IT and standards for ARMA International. He currently serves as director of sales and operations for Xact Data Discovery in Kansas City, Mo.
©2013 Bloomberg Finance L.P. All rights reserved. Bloomberg Law Reports ® is a registered trademark and service mark of Bloomberg Finance L.P.
This document and any discussions set forth herein are for informational purposes only, and should not be construed as legal advice, which has to be addressed to particular facts and circumstances involved in any given situation. Review or use of the document and any discussions does not create an attorney-client relationship with the author or publisher. To the extent that this document may contain suggested provisions, they will require modification to suit a particular transaction, jurisdiction or situation. Please consult with an attorney with the appropriate level of experience if you have any questions. Any tax information contained in the document or discussions is not intended to be used, and cannot be used, for purposes of avoiding penalties imposed under the United States Internal Revenue Code. Any opinions expressed are those of the author. Bloomberg Finance L.P. and its affiliated entities do not take responsibility for the content in this document or discussions and do not make any representation or warranty as to their completeness or accuracy.