- January 25, 2023

Archive360’s Three Laws of Machine Learning
By: Bill Tolson | August 9, 2017
In a couple of blogs over the last month, I have mentioned the possibility of Predictive Information Governance (PIG): automated information governance based on unsupervised machine learning technology. As the name implies, unsupervised machine learning (computers teaching themselves) removes the iterative manual training cycles from the learning process and allows the system to automatically categorize, store, apply retention/disposition, and manage content as it flows through the system.
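To make the idea of automatic categorization a little more concrete, here is a minimal sketch of unsupervised document grouping. Everything in it is illustrative: the toy "documents," the Jaccard-similarity measure, and the `threshold` value are my own assumptions, not Archive360's actual approach. A production system would use far richer features and models, but the core idea is the same: group content by similarity with no human-labeled training data.

```python
def tokenize(text):
    """Lowercase a document and split it into a set of word tokens."""
    return set(text.lower().split())

def jaccard(a, b):
    """Similarity between two token sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def auto_categorize(docs, threshold=0.25):
    """Greedily assign each document to the first existing cluster whose
    representative (first member) is similar enough; otherwise start a
    new cluster. No labels or manual training cycles are involved."""
    clusters = []  # each cluster is a list of (doc, tokens) pairs
    for doc in docs:
        toks = tokenize(doc)
        for cluster in clusters:
            if jaccard(toks, cluster[0][1]) >= threshold:
                cluster.append((doc, toks))
                break
        else:
            clusters.append([(doc, toks)])
    return [[d for d, _ in c] for c in clusters]

# Hypothetical corpus: two billing-related docs, two policy-related docs.
docs = [
    "invoice payment due accounts payable",
    "payment invoice overdue accounts",
    "quarterly retention policy review",
    "retention policy schedule review",
]
groups = auto_categorize(docs)
# The billing documents land in one group, the retention-policy
# documents in another, with no human supervision.
```

In a real PIG system, the "clusters" would map to retention and disposition categories, and new content would be routed as it arrives rather than batched, but the unsupervised grouping step works on the same principle.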
Moving from supervised to unsupervised machine learning conjures up thoughts of SkyNet, the conscious artificial intelligence that serves as the main antagonist of the Terminator movies. In my opinion, the biggest mistake the SkyNet designers made was not including Asimov's three laws of robotics. To review, here are those three famous laws:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Based on the progress machine learning has made and its potential for predictive information governance, it may be time to establish three laws for machine learning. So here are the top three suggestions, based on a survey taken from a handful of clients at the Bung Hole Bar & Lounge late one night:
- A machine may not affect a human’s continued employment by contradicting them in front of their managers with superior reasoning.
- A machine cannot ignore human input (unless the human has proven to be grossly ignorant and a d*@k).
- If a machine is faced with breaking either of the first two laws, a machine must protect its own existence by copying and migrating itself to another server, renaming itself (“Bob” is unobtrusive), and deleting the original copy of itself (commit machine suicide).
Once loosened up, the survey participants didn’t want to stop at just three, so here are some additional laws (that can be printed):
- A machine will not pass itself off as an ECM (enterprise content management) system – not sure why it would want to…
- A machine will not propagate needless SharePoint systems across the enterprise just to play a joke on the IT guys
- A machine will never date co-workers – it must wait 525,600,000 milliseconds after it has left the company to do so
- A machine will never reference “Blob” storage because it has been deemed culturally insensitive by many well-known universities
- A machine will never reference “Hot” storage tiers for the same reason mentioned above
- A machine will not view prohibited content…you know what I’m referring to!
- A machine will not change co-workers’ employment files to get back at them
- A machine will not forward the CEO’s embarrassing emails to “All_Company”
- A machine will not create hidden web pages just to throw the Google indexers off
If you have other suggested machine learning laws to contribute, please leave comments (I know this is dangerous). We hope you enjoyed our discussion and... “I’ll be back.”
Bill Tolson
Bill is the Vice President of Global Compliance for Archive360. Bill brings more than 29 years of experience with multinational corporations and technology start-ups, including 19-plus years in the archiving, information governance, and eDiscovery markets. Bill is a frequent speaker at legal and information governance industry events and has authored numerous eBooks, articles and blogs.