Need to know vs. need to hide – a conflict you will sooner or later stumble across when thinking about knowledge management. But can this conflict be solved? Well, I would say: yes.
Basically all IT security frameworks, coming from a risk management perspective, strongly recommend or even oblige you to have a proper IAM (identity & access management) in place. Be it international ones like ISO 270xx and 310xx, or national ones like BAIT/MaRisk and BSI mGS in Germany, all of them tell you that you have to manage permissions for accessing systems and data. The problem is, they don’t really tell you how. Or, to be more precise, how to organise that job in the first place, except maybe to put the responsibility into the asset owners’ hands. Example: ISO 27001:2013 A.9.1.1 says “An access control policy shall be established, documented and reviewed based on business and information security requirements”. A.9.2.5 adds “Asset owners shall review users’ access rights at regular intervals”, and A.9.4.1 states “Access to information and application system functions shall be restricted in accordance with the access control policy”.
The only thing all frameworks have in common is the requirement to grant only the least possible level of access, i.e. to give each individual only as much access as is absolutely necessary to do their job. But how do you know whether someone else might need exactly that bit of information you are currently working on?
(Un)necessary permissions – who decides?
Okay, it’s the asset owners who should decide who gets access to what. But on what basis can they decide? That’s where the access control policy comes into play. The classical approach is to structure information stores, like file systems, along the organisation’s structure, the process map, by projects, or whatever sorting criterion seems appropriate. The next step is then to restrict access to the user’s department, the business processes they support, the projects they work on, and so on. As a consequence, you end up with a huge number of permissions: the more granular you design the structure, the more permissions you get. All of this has to be managed, which alone can be quite a challenge, especially if it has to be done manually and there is no IAM system in place. In addition, you might run into technical problems: when using Active Directory as the basis for access control through AD groups, which are then synchronised into other systems, you can hit limits on the number of groups a user can be a member of (Kerberos token size, LDAP sync, …). And the larger the structures get, the smaller the portion of information each individual gets to see.
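If you want a quick impression of how close your users already are to such limits, a few lines of PowerShell can help. Here is a minimal sketch, assuming the RSAT ActiveDirectory module is available; the account name is hypothetical, and note that the Kerberos token counts transitive (nested) memberships, so the direct count shown here is only a lower bound:

```powershell
# Requires the RSAT ActiveDirectory module on the admin workstation
Import-Module ActiveDirectory

# 'ttender' is a hypothetical sAMAccountName used for illustration
$user   = Get-ADUser -Identity 'ttender'
$groups = Get-ADPrincipalGroupMembership -Identity $user

# Direct memberships only - the Kerberos token also contains
# nested group SIDs, so treat this number as a lower bound
'{0} is a direct member of {1} groups' -f $user.SamAccountName, $groups.Count
```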
If people can’t share it, they’ll duplicate it
Experience shows that users always take the easiest and fastest way to reach their goal. Imagine the following situation:
Someone from a business department, let’s call him Tom Tender, is working on a tender as part of a small project. There is a small team, but no dedicated project organisation in terms of project-associated storage has been established. As a consequence, Tom puts all the project-related files into a folder within his department file share and gives them speaking names like “Tender_IT_2021_v0.8_draft.docx”. Now he needs some advice from a colleague, Jim Judge, who works for a different department and naturally does not have access to Tom’s file share. As time matters, Tom sends the document in question to Jim by e-mail. To keep track of the versions, Tom names the file “Tender_IT_2021_v0.8_draft_Jim.docx”. Jim works on the document, creates a new version as there are some significant changes, and sends it back as “Tender_IT_2021_v0.9_draft_Jim.docx”.

In the meantime, Tom notices that his colleague Ruth Rater also needs to be consulted. Okay, no problem, “Tender_IT_2021_v0.8_draft.docx” is quickly sent to Ruth via Teams, because she is on a business trip and only has access to Teams on her mobile. Ruth makes some amendments and sends it back to Tom, keeping the name of the file. Somehow Tom then manages to merge the three versions into one, which he distributes to the project team and, in addition, to Ruth and Jim – by e-mail, logically, named “Tender_IT_2021_v1.0_final.docx”. Ah, and not to forget, to Betty Buyer, for further processing. Guess what? Betty also needs to make some small but important corrections, so version “Tender_IT_2021_v1.0_final_updated.docx” is quickly created and sent back, thanks to Outlook’s “reply all” feature, and is also distributed to some external business partners.
As a result, there are not only numerous different versions of the file stored in different places, but also lots of copies of the different versions. Imagine Tom and Betty use the same distribution list of about ten people, and the original file has a size of 2 MB. Base64 mail encoding inflates each attachment by roughly a third, to about 2.7 MB per copy, and every recipient’s mailbox holds one – plus the “Sent items” in the senders’ own mailboxes. Just by sending the file back and forth the way they did, far more than 40 MB of mail data were created, a factor of more than 20. In a real-life environment it is usually not just one file that is handled this way, but many, and very often they are quite large. Unless you have a clever de-duplication solution in place, you have to buy, run and back up unnecessary amounts of storage. Just give it a try and run a duplicate detection script over your filers – you’ll be surprised how many matches it finds!
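The script linked at the end of this article does the heavy lifting for you; to illustrate the idea, here is a minimal sketch in PowerShell (the share path is hypothetical) that groups files by size first, so that only candidates with matching sizes need to be hashed:

```powershell
# Minimal duplicate finder: group by size (cheap), then confirm by hash (exact)
$root = 'D:\Shares'   # hypothetical filer path - adjust to your environment

Get-ChildItem -Path $root -File -Recurse -ErrorAction SilentlyContinue |
    Group-Object -Property Length |      # same size = duplicate candidate
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group } |
    Get-FileHash -Algorithm SHA256 |     # hash only the candidates
    Group-Object -Property Hash |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        'Duplicate set ({0} files):' -f $_.Count
        $_.Group.Path
    }
```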
A final version is not necessarily final
Back to Tom, who left for holidays immediately after sending the file to Betty. As all other recipients of Betty’s mail were only in CC, no one cared about putting the final file back into the file system alongside the “original” file. A week later, Tom’s boss wants to know the status of the project and checks all related files in the department share. He finds the “…v1.0_final” version and is happy, because a friend of his from a partner company also wanted to participate in the tender, and so that friend gets the file sent over by Tom’s boss. And, as what can go wrong will go wrong, that company nearly wins the tender – based on an offer calculated on an outdated document version. I leave it to your imagination which consequences might arise from this…
Shared work is half work (well, kind of)
Coming back to Tom and his tender. The version conflict was detected thanks to a request the partner company sent to Betty, the tender was correctly evaluated, a bidder was selected and an order was placed. A couple of months later, Tom’s neighbouring department starts a project similar to Tom’s. Charly Chaser has been given the project manager role, and he is a bit cleverer than Tom: he requests a project share from the file service administration and grants write permissions to all team members and other project stakeholders. So far, so good – file duplication problem solved, at least internally. But now the next problem arises for Charly. He has never run a project of that kind, not to mention a tender. So he starts investigating the different topics, creates a lot of files, and after some weeks his project has reached a good degree of maturity. Then, just by chance, he meets Tom at the coffee machine and tells him about his project. Tom replies that it was actually him who ran a similar project a couple of months ago, and offers to share all project-related information with Charly. So Tom gets access to Charly’s project and copies all his files over – keeping the originals, of course, because his own team still needs them. Charly, although angry because most of his work had been redundant, uses the chance to do some QC’ing on his project based on Tom’s data.
Sharing is one thing, knowing another
If you scale the little example above up to enterprise-grade companies, you will find that a huge amount of work, storage space and thus money could be saved if all information and knowledge were available and accessible to all users. There is no doubt that many companies have already moved to modern collaboration and file sharing solutions, based on O365 or other platforms. But there are also quite a number of companies who run their old file servers, NetApps or whatever solutions have grown over the years, and who have restrictive access control mechanisms in place. Restrictive meaning: following the classically interpreted “need to know” principle instead of a modern, open “need to hide” principle.
All access control policies start with an assessment of protection requirements from a risk point of view, regardless of the standard you follow. I am convinced that the approach should always be to protect only those information items where disclosure to the different stakeholders could harm the company, and to simply declare everything else “company public”. Every employee and contractor signs a non-disclosure agreement anyway, so what sense does it make to hide things that might be useful for a certain user’s work, even if this is not obvious at the time of the assessment? You just have to make sure that what is secret stays secret, i.e. mechanisms must be in place that prevent users from sharing things with others who, for good reason, must not have access – personnel data, for example. In most cases this can be achieved by technical measures, like removing the permission to change object permissions. Where no technical solution is possible, a company agreement must cover an organisational solution.
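On a classical NTFS file share, for example, this could look like the following sketch: a deny entry that keeps ordinary users from changing permissions or taking ownership, so they cannot re-share protected content themselves. Path and group name are hypothetical placeholders, and your environment may call for a different approach:

```powershell
# Sketch: prevent ordinary users from changing ACLs or taking ownership
# on a share; path and group are hypothetical placeholders
$path  = 'D:\Shares\CompanyPublic'
$group = 'CORP\Domain Users'

$acl    = Get-Acl -Path $path
$rights = [System.Security.AccessControl.FileSystemRights]'ChangePermissions, TakeOwnership'
$deny   = [System.Security.AccessControl.FileSystemAccessRule]::new(
    $group, $rights,
    [System.Security.AccessControl.InheritanceFlags]'ContainerInherit, ObjectInherit',
    [System.Security.AccessControl.PropagationFlags]::None,
    [System.Security.AccessControl.AccessControlType]::Deny)

$acl.AddAccessRule($deny)
Set-Acl -Path $path -AclObject $acl
```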
Personal or company data?
A real-life example: to make things easier for IT support, we needed a way of automatically identifying each user’s default PC. The solution was to have the inventory system collect the user login counts and timestamps from every machine, and in the ITSM database set these in relation to all of the user’s logins, using some simple logarithm-based mathematics, ending up with a rating between 0% and 100% for each machine a user ever logged on to. The machine with the highest rating was then picked by the ITSM tool as the most probable default machine. The necessary agreement with the works council was quite simple: collect all data as required, store it accessible only to admins, guarantee not to do any reporting on the raw data, and only load the resulting ratings into the ITSM tool, where they are accessible to the IT support people.
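The exact formula does not matter here, but to make the idea tangible: a rating along these lines could be computed as in the following sketch, with purely hypothetical login counts (the mathematics in our actual ITSM tool may differ in detail):

```powershell
# Hypothetical raw data: login counts per machine for one user
$logins = @{ 'PC-0815' = 230; 'PC-4711' = 12; 'TERMINAL-01' = 3 }

# Logarithmic scaling dampens very high counts, then normalise
# so that the strongest candidate gets a rating of 100%
$scaled = @{}
foreach ($pc in $logins.Keys) { $scaled[$pc] = [math]::Log(1 + $logins[$pc]) }
$max = ($scaled.Values | Measure-Object -Maximum).Maximum

foreach ($pc in $scaled.Keys) {
    '{0}: {1}%' -f $pc, [math]::Round(100 * $scaled[$pc] / $max)
}
# The machine with the highest rating (here PC-0815) would be picked
# as the user's most probable default PC
```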
The example shows that the boundary between company and personal data is not always easy to draw. The underlying login data is clearly personal and must be protected, but the derived ratings are – in my opinion – uncritical.
Information classification is the key
When preparing the assessment, you have to get a clear and common understanding of which classes of information you have and which of them should not be publicly accessible. This varies from country to country; the most common examples are personal data and, especially in the EU, everything that could be used for behavioural and performance monitoring. Other examples are financial data, like revenue reports before their publication, or product recipes.
After you have identified the information that needs a higher protection level during your assessment, everything else (which should be the majority) can be declared as public. And even if you do not have a highly sophisticated document management system and an enterprise search engine in place, a clearly structured file storage with server aliases and non-cryptic folder names can be a good starting point.
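A server alias is nothing exotic: a simple DNS CNAME record already decouples the name users see from the physical machine behind it, so share links survive server migrations. A minimal sketch, assuming the DnsServer module on a Windows DNS server; all names are hypothetical, and SMB access via an alias may need additional settings such as SPN registration:

```powershell
# Sketch: 'projects' becomes a stable alias for whatever file server
# currently holds the data; zone and host names are hypothetical
Add-DnsServerResourceRecordCName -ZoneName 'corp.example.com' `
    -Name 'projects' -HostNameAlias 'fileserver042.corp.example.com'

# Users access \\projects.corp.example.com\<share> from now on,
# regardless of which physical box serves the data
```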
Conclusion: Data – Information – Knowledge – Value
When following the radical openness approach described above, you are well on your way to getting the most value out of your data. Tools and systems will help you along, but most important are the right mindset and the support of top management as well as the data protection officer, the information security officer and, where applicable, the works council.
If we only knew what we know… we could save lots of work, time and money, and be ahead of our competitors!
Downloads & further reading
- Ready-to-use script for finding duplicate files: Find Duplicate Files Script
- Original script library source (functions used in the above script): https://powershell.one/tricks/filesystem/finding-duplicate-files
- Supervisory Requirements for IT in Financial Institutions (Bankaufsichtliche Anforderungen an die IT – BAIT): https://www.bafin.de/SharedDocs/Downloads/EN/Rundschreiben/dl_rs_1710_ba_BAIT_en.html
- Minimum Requirements for Risk Management (Mindestanforderungen an das Risikomanagement – MaRisk): https://www.bafin.de/SharedDocs/Veroeffentlichungen/EN/Meldung/2018/meldung_181015_veroeffentlichung_marisk_englisch_en.html
- The chained library in Wimborne Minster, Dorset, UK (featured image): http://www.wimborneminster.org.uk/110/chained-library.html
The ISO standards are not available as a free download.