On October 22nd, 2018, I held a teleconference to discuss the topic of Information Technology (IT) policies with data privacy expert Michael Feldman, Esq., and cybersecurity expert and consultant Jeff Miller. The complete video is published on YouTube. Below, we are also providing a transcript of the entire call.
Daniel: All right, guys, welcome back for our second episode here. Today, we’re fortunate to be joined once again by Michael Feldman, attorney at law from the Summit, New Jersey-based law firm Olender-Feldman, and cybersecurity expert and consultant to us, Jeff Miller. Thank you once again, guys, for joining me.
Michael: Thanks for having us.
Daniel: Morning. Yeah, today is all about cybersecurity policy, IT policies for small and mid-sized companies, private companies in particular. So, Jeff, I’m just going to run right into it here. We’ve been talking a lot about policy this week here and with our customers. Why do we need cybersecurity policies, technology use policies, internet acceptable use policies? Why do we need any policy? I mean, aren’t some things better left unsaid?
Jeff: So good question there. A policy is important because it takes the mission, the vision, the risk appetite, the general beliefs of an organization and it puts them down on paper. So, if you don’t do that and don’t disseminate that information, people are going to be acting differently per department, differently per manager, and things are going to be crazy. So policy is in place for a number of reasons. It may be required due to compliance or regulatory needs. It may be put in place to guide employee behavior. Things like acceptable use like you just mentioned or mobile device management, those things are meant to guide employee behavior and let them know what’s in bounds and what’s not in bounds.
What policy isn’t is something tied to a particular technology. A policy is meant to be broad sweeping in nature. And down the chain, you have things like standards, guidelines, and procedures. Policy sits at the top and is the most, again, broad sweeping. Down at the bottom of the pyramid of IT documentation, you have procedures. And those are very detailed and granular. For example, an incident response procedure would outline who does what, when they do it, and under what criteria. And so at the bottom, you have procedures, but that’s not policy. Again, the policy is meant to be an overarching, broad sweeping framework to kind of map out the goals, the objectives, and those sorts of high-level things within an organization.
Daniel: Right. One thing policy is, is maybe what we expect our employees to do or what we expect them not to do, and then the procedure might be, correct me if I’m wrong, for the server administrator or system administrator, what do I do in the event of a breach? Is that about right?
Jeff: Right, right. Procedures are far more granular. There’s really step A, step B, step C, and then all the sub-steps in between. So they’re meant to…when your hair’s on fire, you can kind of go on autopilot in a situation like a cybersecurity breach. You can just follow the steps. And yeah, they’re far more granular than policy would be.
Daniel: Gotcha. Michael, from the legal perspective, I suppose you’re going to tell me that ignorance is not bliss and we’re probably on better footing if we have some well thought out policies. What do you have to add on that?
Michael: It’s almost like you read my mind on that one. Yeah, you absolutely want to have policies in place. And there are many reasons and many things that policies satisfy. Your company may have contractual obligations, and that may be an obligation to your customer. It may be an obligation because you serve as a vendor, or for that matter, it may be an obligation you want to impose on your vendors. So whether it’s cybersecurity, IT, or information security policies, you may first have a contractual obligation to have such a policy in place. It may be an obligation to just have a written policy that somebody wants to see (I see that happening all the time), or it simply may be the fact that you have a policy in place even if it’s verbal. I’ve worked with some very small companies that have done it verbally because there are only a few people that are really dealing with issues.
Second is in many industries, you can use these policies as a marketing tool. I give the analogy into the auto industry and safety features where companies…you see Volvo doing it, you see Mercedes, you see Subaru doing it with all-wheel drive where they take safety features that may be required by law, maybe they’re optional somewhat, maybe they’re becoming industry standard, but they turn that aspect of their business into a marketing tool. And in many industries, you can do that with respect to your privacy and security practices, a policy being the first part. It’s hard to have these practices in place if you don’t have a policy first. And you’re seeing more and more businesses again use that as an affirmative marketing tool in their business.
Third, in the event you have a data breach, and I use that term broadly and we may speak about that later, but to the extent you have a data breach or a technical error, you want to be able to show regulators, you want to be able to show your customers and clients, and maybe most importantly, you want to be able to show a prospective plaintiff that you took reasonable steps to prevent what I’ll call a bad act from happening. You can’t just stick your head in the sand and pretend it didn’t happen. If you do that, you’re a sitting duck. It will happen. Chances are you will be breached, you will have an error, whether it’s malicious, whether it’s accidental, and having the policy is kind of the backbone of your defense to everybody, and it could potentially save your business.
The fourth is kind of the obvious point. And this is one of the things Jeff was talking about, I believe, as well, which is to minimize the likelihood of these bad events happening in the first place, you want to have a policy in place. There’s a procedure that follows but you need the policy in place to prevent the bad thing from happening. Again, going back to the safety analogy, you need to have policies, so don’t speed, wear your seat belt, have anti-lock brakes, have all-wheel drive, snow tires in bad weather, etc. The same goes for these policies. You first need to have them in place then you can train on those policies I should say. And now you’re minimizing the likelihood that a bad event happens and that if it does happen, the damages flowing from that bad event are minimized.
Daniel: Yeah, that’s great. What comes to mind is the word underpinning. If we have good policies and procedures, we kind of have the underpinnings for everything else. We’ve done our due diligence. We have some safety nets in place. We have procedures to act appropriately and not run around like our heads are cut off. So thank you for that. So, Jeff, we’ve written tons of policies for customers. Many customers or people out there have heard of acceptable use policies, internet acceptable use policies. Give me some examples of stuff that you’ve written or policies you see. And I mean, for a small to mid-size company, is there usually one big technology policy manual? Are they broken up into different policies for different departments? For example, is there a cybersecurity versus an internet acceptable use policy, mobile phone policy? What do you usually see or what do you usually ask to produce?
Jeff: Good questions, Dan. So the smaller the business, typically, the more policy statements are wrapped into one single policy document. And the larger the company is, they tend to get broken out. So it doesn’t make sense to have 15 different, let’s say, Word documents with two or three sentences each. If you’re a real small company without a significant amount of third parties that you work with or without a significant IT environment, you can kind of roll those all into one policy and call it an IT security policy, which has policy statements for each different area. As you get bigger though, it is more common to break policies out into their own separate policies.
So some common ones I’ve seen and we helped write recently are third-party vendor management. That’s huge in the Northeast. With different regulations that have come through recently, a lot of companies don’t think about who their third-party vendors are and how they’re going to manage them and the risk that third parties play into the data that companies store. Think about Office 365. Microsoft is a third-party service provider. So how are we monitoring that service and what SLAs are in place? If their service goes down, is there a financial…do they get money back? Again, the bigger you get, you want to sort of break these policies out. We have things like, let’s say, a systems and network security policy, and that would be, “Hey, you have to have antivirus. Hey, it’s gotta be updated. We’re going to segment traffic and things like that,” but not getting into the specific implementation, just that these things need to be done. And the other thing that policies should do is also not be tied to a specific individual’s name. It shouldn’t say Dan or Michael or Jeff. It should say the IT manager, the IT director, the CIO. So titles should be used in the policy and not necessarily somebody’s name, because who knows if they’re going to be there next year.
Daniel: Understood, understood. I remember a time where…it doesn’t seem like too long ago where employees of a company felt that no one can listen to my voicemail at my company or no one could read my email. That’s my personal voicemail and email. And then what you started to see more and more in the internet acceptable use policies and various policies was that the company’s taking an affirmative stance and putting employees on notice saying, “Hey, if you’re using our company-owned phone system and our company-owned computer on our company-purchased internet connection, you shouldn’t have an expectation of privacy,” and letting employees know that. Michael, is that generally the case where companies will say, “Hey, there’s no expectation of privacy with your company email and company voicemail, etc.”?
Michael: Well, absolutely, and you’re raising almost a separate issue here, a new issue here, in that there are two types of main policies we’re talking about here. One would be your internal-facing, which addresses issues that you’re talking about. It might even be the employee handbook that deals with the expectation of privacy and what you do, when you do it, whether it’s a bring your own device to work policy, we can look at your cell phone, we can’t look at your cell phone, we will observe everything that you do online or we won’t, whatever the case might be, even down to we have security cameras on the premises that are going to be watching you. That sets out the expectation for your own employees and that can obviously overlap with an information security policy as well.
Then there’s the external policies that deal with your customers, your clients, your vendors, what have you, but third parties that you’re dealing with. And in that regard, generally, what I like to do is…this isn’t unique for me personally, but look at what the company is doing, what their business is, what they’re selling, what they’re presenting, the type of information or data or materials they’re handling, the sensitivity of it or lack thereof. There’s obviously a difference if you’re working with intellectual property or technology versus you’re selling lollipops that have two ingredients. The security may or may not matter. The issues may or may not matter. Once you know what the important business issues are for any particular entity, then you start to determine what you need to do to protect the business interest and that will then dictate whether it’s 1 policy, 10 policies, or the types of policies you’re dealing with. You may have departmental policies.
Certainly, oftentimes, I’ll see one policy that’s aimed at the IT crowd that needs to deal with cybersecurity and actually implementing it from a technical view and a separate policy that’s for the average employee who doesn’t want to know anything nor do they need to know anything about [inaudible 00:3:54] certification or encryption or how it’s encrypted or to what degree. But they do need to know how to use a password, what information they should and shouldn’t share, closing their computer, locking their computer, what to do with their phone, how to protect it better. So that, to me, then dictates ultimately the types of policies, which circles back to your question. The same would go for employees. It’s a two-person company, a husband and wife, they don’t really need to have a written policy on the privacy of their email. If you’re a thousand employees all over the place, you probably want a policy like that.
Daniel: Yeah. That’s good and it kind of…I’m going to skip ahead to a question I have. I think you’ve mostly answered it. Maybe you could just emphasize a little bit. A lot of people these days, they realize that they need to have something. Maybe they’re a smaller organization. Maybe they don’t have the funds to invest into a big protracted policy documentation endeavor. They go out on the internet and they find an internet acceptable use policy template and they download it and they’re done. But I think what you just said is that these policies are going to be as different as the organizations that employ them. There’s no cookie cutter policy that you kind of download and customize a little bit for your needs. I mean, it really depends on what you’re doing and what your goals are.
Michael: Well, there’s a few problems with doing what you’re talking about, even though I believe a ton of businesses do exactly that: they go online and they download something. The first is, a lot of clients and businesses out there (and I say clients because I’ve had clients that do this and then come to me and say, “Is this okay?”), rather than go to a pay site and get some form policy that they paid $25 or $50 for, which generally is not good to begin with, go to a competitor’s website and download their competitor’s policies. They say, “Well, this is a business that’s like mine. I’m going to take it. I’m going to change some names to make it my policy.” The first thing that they don’t think about is there’s copyright infringement there, and they potentially open themselves up for a problem down the road, and there are statutory damages, attorney’s fees. They could be buying themselves a whole problem they’re not thinking about. But even if they go the legal route and pay for one of these generic policies…you nailed it on the head in terms of it being specifically your business. You don’t start a business by just doing what someone else does regardless of whether that is right for your business or not.
Daniel: Yeah, understood. There’s clearly a right way to do this. So, Jeff, over to you, one and done. We have a new company. We need a policy. We get it done. We stick it in a drawer and we’re good, right?
Jeff: No, this is why people engage us so often is because somebody had that idea a long, long time ago. What we find in a lot of the projects that we get into with policy development is there’s a binder and it’s got about an inch of dust on it and it refers to things that don’t even exist anymore. It’s talking about technology that was from 1999. It’s talking about departments that don’t exist or it’s just completely out of date. And that’s the problem is people know inherently that they need a policy, somebody eventually spends the time to write it, and then it just sits there on the shelf collecting dust. And there’s two problems to that. Number one is if it’s collecting dust, nobody’s reading it. Number two, you’re not updating it so it’s kind of useless. So the problems that your business faces today are going to be far different than the problems that your business faced 5 or 10 years ago.
So we tell people it’s important to update policy at a minimum once a year, even if it’s not an update but just to review once a year for a couple hours. It’s not going to hurt anybody and you might find that you look at policy and it’s vastly different based on some new ERP system that you installed. Maybe the level of encryption is no longer effective because somebody has cracked that level of encryption, so that needs to be changed. Or maybe the corporation as a whole has a different viewpoint on their risk appetite. So because of all those things, it’s important at least annually to review policy and, where needed, to make adjustments.
Daniel: Great. And Michael, over to you on this issue. When the Attorney General’s Office gets involved when something goes sour, or the FBI, or the Inspector General, or whoever it might be comes knocking because maybe there was a breach or someone reported something, the company that has no policies at all and seemingly stuck their head in the sand or the company that has 18-month-old or 3 or 5-year-old policies, they’re probably on similar footing.
Michael: Yeah. They could definitely be on similar footing. And one thing you’re leaving out, one step there, is updating the policies is critical, but here’s the thing: you could have the most up-to-date policy, you do exactly what you should do, you reviewed it and you’ve updated it for technology and everything else, but if you lock it away then, you have accomplished very little. The first thing, or maybe the second thing, that a regulator is going to do, that a plaintiff’s lawyer is going to do, that a party to a contract is going to do, is ask about training. And if you have the policy, you invest all the time and energy and money in developing the policy and don’t train on it, don’t teach people about it, you don’t actually see if it has flaws in practical use, you’ve really accomplished very little. You’ve probably spent a lot of time and a lot of money and accomplished very little. And you probably see by now I like using analogies and I can’t tell you why but I do. You’re the coach of a football team. You develop a great playbook for offense and defense. You write it, you work with the best experts out there, then you lock it away and you go to the team and say, “All right. Let’s play.” I mean, nobody would do that. So what’s the purpose of developing?
Daniel: Yes, I love that.
Michael: [inaudible 00:23:59] plan and then lock it away. Don’t tell anybody about it. Now you’re almost worse off because you’ve spent all the money. When anyone digs into what you did…going back to your assumption at the beginning where you’re correct. If anyone digs into what you actually did to prevent that bad occurrence from happening, they’re going to initially say, “Wow, this is great.” You have this robust policy. It’s new. If it’s not been updated, you’re not doing so well, but let’s say you updated. You have this great policy but then you did nothing about it. You failed. You failed yourself. You’re more likely to get in trouble in the first place and the consequences will be greater.
Daniel: Yeah. That’s a great point because it sounds like you constantly have to make sure that people have the policies available to them, readily available to them, and digest them as frequently as possible and revisit them because most people…I remember when I was not a business owner and I was in the workforce, the first day of work you show up and usually you get handed a bunch of documents, one of which might be the HR manual, for example, that contains policy. And what do you do? You take it home, you stick it away, and you never look at it again. So I think now more than ever companies need to make sure that policy is not only updated but also readily available for consumption. Jeff, do you know of certain regulations out there that either imply or require explicitly or maybe even indirectly that an organization has to maintain updated policies?
Jeff: Yes, all of them, all the cybersecurity regulations, all the cybersecurity frameworks in all 50 states. It doesn’t matter if you’re for-profit, not-for-profit, in this vertical or that vertical, everybody has to have policies. So the 50 states comment is due to the fact that if there is a data breach, in all 50 states there are breach notification regulations at the state level. So for that, you need to have an incident response plan. So how do you know if you’ve been breached? That’s the detection piece of your incident response plan. Then what do you do when you’re breached? That’s also part of the incident response plan, which is hey, if it’s this kind of data, contact this entity, whether it’s the FBI, whether it’s the Attorney General’s Office, your state, or whether…if it’s healthcare data, you have to contact Health and Human Services and so on. All that stuff has to be in there just because of the fact that every state has a breach notification requirement.
Then HIPAA, of course, there’s a requirement to report within a certain number of hours after knowing that you’ve been breached. There’s a requirement there, same thing with PCI. Literally, any of the regulations that exist require policy. And in a lot of these, like, for example, the NIST Cybersecurity Framework, policy takes up almost a third of the entire framework. It’s just laying down the expectations and having it written down and disseminated and understood, and from that jumping-off point then you can go in and effectively make changes to the environment to secure it, but you have to start with policy.
Daniel: Great, great. Michael, on the legal front, can you kind of bring it home for us, make it real, make it practical? Maybe give us an example or two where having or not having these policies is going to significantly strengthen or maybe weaken your defense footing or your position in a contractual dispute in the event that there are certain allegations or incidents perhaps involving data privacy.
Michael: Sure. I’ll give you a few examples of things that have actually occurred in the practice over the years. I’m not going to give names anywhere. One is ransomware, which is probably the fastest growing cause of damage to companies in terms of hacking. Ransomware where there’s an email seemingly coming from a client, you open it, you open the attachment. Next thing you know your system starts to shut down. And we had somebody who opened an email from a client. It had what they thought was a Word document. They opened it. One by one the documents on their server started to lock up and they were told pay money or all your stuff is gone. Had there been a policy in place to not open attachments when you’re not expecting them, to contact the client if there’s any doubt, or to scan it perhaps to find out if there’s anything there, these types of incidents can be avoided. In most examples of ransomware, while the software, the bug itself, is sophisticated, how it gets in in the first place isn’t that sophisticated. It’s somebody who’s not being careful. And if you have policies in place that can be avoided.
Another example is one that probably most people wouldn’t think of, and I’ll say I didn’t really think of it until it occurred: somebody used FedEx to send a package. It had very sensitive information in there. The package included a thumb drive. There were paper documents with sensitive information and a thumb drive. The package was lost. It never reached its destination. FedEx couldn’t say what happened. It was gone. That’s a data breach. The way to prevent that is you can have a policy. For example, if you’re dealing with sensitive information and you’re sending it, put it all on a thumb drive or some sort of disk that you can encrypt. So if it’s lost, you now don’t have a data breach. You don’t have to deal with notifying potentially thousands, tens of thousands of people and pay for credit monitoring, whatever their remedy might be. Again, having a policy in place.
Daniel: Makes a lot of sense. I mean, that really does make it real because I was just taking some notes as you were talking here and I’m thinking, if the company has a policy for all new employees, or an ongoing policy, that cybersecurity training is something that we do in our organization and you are required as an employee to make sure that you get that training, which is readily available. Once I get that training, I may never send that removable drive via FedEx. And maybe on top of that training there’s another layer of the onion, a policy which says, “We’ve trained you to be aware of cybersecurity, so sending things through the mail or FedEx is probably not a good idea. At Company ABC, we are not allowed to ever send any type of computer media through the mail or any other delivery system.” So I hear put in the safety nets, and that’s what the policy is going to do. And then maybe another layer that says, “Everything that we put on removable media must be encrypted,” or, “In Company ABC, we are not allowed to use USB drives or removable media.” So, I love the fact that we’re tying that all back to policy and education. And for the education piece, maybe the policy is, as a mandatory practice, you have to take this cybersecurity awareness training. So that really does pull it all together. So that was great.
Michael: You actually bring up the point that brings it back to the beginning as well, about customization or pulling something offline. The things you just talked about and that I just talked about may or may not be right for your company depending on what your company does. So if you read a policy that says, “Don’t send anything by FedEx and encrypt everything,” that may be great for somebody. For another company, that may be onerous and make no sense and you might be dealing with people who can’t implement it. So if you don’t first consider your company, your assets, what you have, what you’re trying to accomplish, you can’t then have a policy that actually fits you. And having a policy, for example, that says, “Encrypt everything, don’t send it by FedEx,” or, “Don’t send any electronic media except by, I don’t know, hand delivery or something by an employee,” if you have that because it sounds good and you don’t follow it and something goes wrong, again, you’re shooting yourself in the foot.
Daniel: Right. No, I love it. I think that does pull it all together. We’re running out of time. Michael, I’m going to give you the last word here. One thing you said to me last time we spoke and it really stuck out to me, and I’m kind of paraphrasing here, you said, “You will be breached. You will be hacked.” It’s a bold statement. Expound on that position for us. What do you mean? “You will be hacked. It’s going to happen.”
Michael: Well, if you look at the evidence, it’s happening, whether it’s a government bad actor. I’m not going to name countries, but you look in the paper every day, there are government entities that are hacking. They might not be doing anything bad now. They might be doing it for national security or it might be corporate espionage with a government behind it. You have private hackers who are doing it for profit, some selling information on the dark web, some espionage. Some are just kids who are having fun seeing what they can do. If you actually look at the numbers and look at your own system, you’ll probably see that people are trying to get into it. They may not be sophisticated. They may not get in but they’re trying and it is happening. You look in the news, the biggest companies are getting hacked, and it’s not just one of them. It’s not just Sony. I mean, everyone out there is getting hacked. And if the big companies that probably are spending the most money and having the most policies and the biggest security budget are getting hacked, it’s sure going to happen to the smaller ones. So that brings back, well, what are you going to do about it?
And what you’re going to do about it is you want to minimize, number one, that it’s going to happen. Number two, you want to minimize what information can be obtained if it does happen. That’s why when you hear about government servers being hacked, it’s generally not the most secure information because they have their data compartmentalized. So if you get into one part, you’re not getting in everywhere. And you want to minimize the consequences to your business if that happens, or let’s just say when that happens, whether it’s public or not. That’s what I mean by “if.” It might not be public. You might not even know if it’s a government actor. But you want to minimize the consequences, and that includes everything we’ve spoken about today so far in terms of training, having a policy, as Jeff mentioned, having an incident response plan. You need to know what to do. Depending on the laws, you may have an obligation to report to either the appropriate authorities or to the data subjects whose information was hacked. The failure to do so could have huge consequences. I think Facebook’s on the verge of facing that now with their most recent breach, under GDPR, and the failure to notify data subjects of the hack. The consequences can be tremendous. So it’s going to happen. What are you going to do about it?
Daniel: Yeah, understood. We say the same thing in terms of our data backup and disaster recovery practice here: it is not “if,” it is definitely “when.” We’re out of time for this one, but I thank you guys once again for joining me. This was great. Until next time, hope you guys have a great day, and we’ll see you soon. Take care.
Jeff: Thank you.