Hardly a day goes by without some mention in the mainstream news of a breach of an organization’s data, or of some new threat ‘worming’ its way through our social networks and into our personal information. It should come as no surprise, then, that so many people speak of “going off the grid.” I’m routinely amused, and, despite managing privacy and security for my organization, a bit alarmed, when I get my quarterly email from Google showing everywhere I’ve been in the last 90 days. So I appreciate the concerns of people who have no expertise in privacy and security.
No company can really operate today without declaring its commitment to maintaining the privacy and security of your data, or of the data it collects on behalf of its customers and vendors. That said, there’s a significant difference between making a statement in a contract or on a website “committing” to keep data secure and private, and actually making the investments, in both time and money, to proactively ensure both.
As the Chief Privacy and Security Officer of a data-driven organization, I can tell you the investment is not to be minimized. Privacy and security are separate but related things. According to the International Association of Privacy Professionals (IAPP), a leader in the privacy and security community:
“Data privacy is focused on the use and governance of personal data—things like putting policies in place to ensure that consumers’ personal information is being collected, shared and used in appropriate ways. Security focuses more on protecting data from malicious attacks and the exploitation of stolen data for profit. While security is necessary for protecting data, it’s not sufficient for addressing privacy.”
Security is not simply a matter of licensing some anti-spam and firewall software and implementing it. Yes, these tools are required, and there’s a cost to implementing them, which can be significant. And when one considers all the other components, such as encryption tools, routers, and the like, that are nominally required to meet a threshold of “secure,” the investment can add up. Thank goodness hosting and managed service providers are now very good at efficiently architecting and managing these tools for their clients.
Privacy by Design
Even with these technologies in place, the concept of “Privacy by Design” comes into play: was the entire solution architected from the ground up with privacy and security treated as being as important as the key functionality and features the solution aims to deliver, or were they an afterthought?
It’s rare that an entrepreneur building a prototype or “minimum viable product” (MVP) has incorporated an architecture that aims to secure and protect the data from the outset; that takes more time and money. But even if they have done so, security is only part of the equation; privacy is the other.
Privacy, as the IAPP definition suggests, is more about how the organization behaves: the procedures and policies it has implemented and adheres to. An example of where the two intersect might be logging in to an application or website. The ‘security’ aspects would have the solution require an ID and password to gain access. But an organization that goes so far as to embrace a privacy-by-design framework might build encryption into the password mechanism and add multi-factor authentication, whereby users must do things like enter a code sent to their email or phone to complete the registration process, simply to verify who they are. One approach becoming more and more common is asking users to identify which images contain bicycles, for example. But all of this adds complexity to the application design as well as the architecture, and it takes more time and money to build and deploy. It’s not typically done in the incubation or MVP stages of development.
So, one might ask if the organization has re-architected their MVP solution to take advantage of these “best practice” approaches to their security.
Let’s assume they’ve made that investment despite the expense. If staff or clients then share passwords, and there is no routine, appropriately frequent audit process to verify who is accessing the solution, all that security could be for naught. It’s akin to saying your house is secure because you have a lock on the door; the lock doesn’t mean much if it’s never used, or if so many copies of the key have been made that you no longer know where they all are.
Standards and Independent Review Matter
While the concept of an audit process is simple enough, implementing and managing one is not. Audits need to be done routinely, and they add overhead costs. This is where privacy and security standards and legislation such as SOC 2, ISO, HIPAA, and FERPA matter. A company claiming “We are FERPA compliant” or “We adhere to SOC 2 requirements” is not the same as one that has been independently audited and has successfully demonstrated not only that it has the technology and policies in place, but that it is using them appropriately and following its procedures, with all of this verified by an external auditor.
In fact, most of these credentialing processes have levels or stages; the first is simply having the technology, policies, and procedures in place. This is what a SOC 2 Type 1 audit entails, for example. That said, having the policies and procedures in a folder in a drawer is a far cry from following them daily. Can your vendor demonstrate that they are doing what their policies and procedures say they will, and that they’re managing and monitoring the technology to a level that satisfies an external auditor, year over year? If not, how committed are they really to doing everything possible to protect your data?
If protection of your data is important, take the time to ask your vendor not just the simple question of whether their solution has privacy and security built in, but whether they have been independently audited, and whether you can see the findings of that audit. It’s the difference between who really cares about privacy and security, and who doesn’t.
— Eric Gombrich has over 30 years of experience in the healthcare information technology and medical device industries, having led organizations in their expansion efforts across the US, Canada, and around the world. As the Chief Revenue Officer and Chief Privacy and Security Officer of Tickit Health, he ensures that the organization’s data and its customers’ data are secure and private.