Thursday, March 21, 2019

Responsiveness vs. reactivity.

What do you do when you encounter something new, something hard?  In our world, we are bound to run into things we don't know how to unpack or process or deal with.  None of our current tools seem to work.  The easiest thing to do is to react: get busy and attack the problem with tools and strategies that, although familiar, clearly aren't suited to solve the issue at hand.

Even though you know a fork isn't the right utensil for eating soup, rather than pause and search for a spoon, you try to compensate by working harder.  You scoop up tiny forkfuls of soup as fast as you can so everyone can see you're working as hard as you can to empty the bowl. 

But you aren't doing your best - you're just wasting a lot of time and energy in a very unproductive way.  

Stop.  Reflect on the situation for a moment.  Ask for help.  Ask for advice. Search the web for "best utensil to eat soup".  Take a few minutes and go find a spoon.  Buy one if you have to.

It takes discipline and confidence to stop reacting, to take a moment to determine the most effective way to respond to a pressing issue.  You need a good answer for the boss if she sees you sitting around thinking while that expensive soup is getting cold.  She needs to explain to her boss why nobody's doing anything about this problem.  Tell her the soup will be cold by the time you're done trying to eat it with a fork, anyway.  And if you don't take the time to get the spoon this time, then next time, you won't have the right utensil to do a better job emptying the soup bowl, either.  If you can convince the boss to let you invest the time into finding a better solution to the problem this time, you'll save everyone a ton of time, money, and frustration down the road.

Respond, don't react.

"What does eating soup with a fork have to do with cybersecurity?", you ask.  Everything.  

Tuesday, March 19, 2019

Intelligence Test

If you were a nation-state, how would you test a rival state's intelligence system?  What if you fed them fake information, then sat back and observed how quickly they reacted to it?  You could measure your own effectiveness at disinformation at the same time you measured their response time.  You'd also begin to understand how they react to different stimuli.  Simply forcing your adversary to react to non-existent issues would throw them off balance and create general malaise.

What if you made them question their tools? Got them to throw away perfectly good - maybe even best-of-breed - systems just because you were able to convince them they were no good, or had a bug?

Now, instead of moving forward, your rival is tied up replacing resources that work just fine - at great cost in labor, money, and time. All for nothing. They're operating at diminished capacity during the replacement, and may replace something effective with something not-so-effective. And you've figured out what buttons to push to make them react, at virtually no cost.

Am I the only person who thinks this is a pretty efficient way to test an opponent's capabilities?

Friday, March 8, 2019

Are New Risks More Risky?

Technology is in a constant state of flux and transformation, and each wave of change drives and accelerates the next.  Tools and solutions - even programming languages - seem to obsolesce before they are fully understood, much less deployed and secured.

This pace can be overwhelming to cybersecurity practitioners, and I think we sometimes cite risk as a defense mechanism against change.

Are new solutions riskier?  If so, is it because they actually carry more inherent risk, or because we don't understand how they work?  Is something we haven't used before less secure than something we have worked with before?  This shock of the new may be behind our perception of the risk of embracing emerging tech.  Security folks don't like disruption, because disruption, like pure chaos, is hard to model, and much of our practice relies on modeling and predictability.  Cutting-edge technology shifts the axis, forcing us to rethink our security formulas.

Some of this discomfort is understandable, but consider that not too long ago, it was unthinkable that anyone would use her credit card to buy something on the Internet - too risky.  Amazon, eBay, and PayPal embraced that risk - and made billions of dollars - precisely because everyone else was afraid to jump into a risky business.  It was madness to put sensitive data on a LAN, because it couldn't be secured.  Now, a lot of our data is moving to the cloud on the open web.  Is it riskier?

I think the risks are just different, but the biggest risk is standing still.  Would it have been risky for Sears or Kmart to move their businesses online?  You bet it would have been!  But it turns out doing nothing was even more risky.  Instead of losing some data, they lost everything.

Technology is advancing - and disrupting - at an ever-accelerating pace.  Jump on the train, or risk getting left at the station.

Thursday, March 7, 2019

Serving the People You Serve

How well are you serving the people you serve?


In a recent Akimbo podcast, Seth Godin asks four core marketing questions that I think apply to cybersecurity.

1. Who do we serve?
I believe that in cybersecurity, we serve three different masters:
"The Customer" is ultimately who we are trying to protect.  She funds our business.
"The Business" signs our paychecks.  Everything they do is supposedly for the customer.
"The Regulators" keep us honest.  They set the standards we have to apply to the technology The Business uses to serve The Customer.
We have a relationship with all three of these masters, and we have to earn their trust and stay in their good graces in order to be effective. 

2. What do the people we serve need?
It isn't your father's IT anymore.  Technology is moving to an "as a service" model.  Microservices, chatbots, automation, The Cloud, mobile, wireless, collaboration.  This raises a lot of red flags for those of us who grew up defending systems with clearly defined boundaries.  The Business demands innovative solutions because The Customer demands them.  And the competition is scratching that itch.  New competitors, new disruptions, are cropping up every day.  The people we serve want to be served in a different way, and if we don't do it, someone else will.  We have to adapt in order to survive.  And we have to do it in a secure, accountable manner.

3. What do we own?
What assets do we already have?  How can we adapt them to serve our people's new needs?  Can we retool our existing systems? More important, can we reimagine the way we use our monitoring, our firewalls, our intrusion detection, our access and identity management, our logs?  Are there new security capabilities in the very disruptive technology that The Business wants to use to serve The Customer?  How can we leverage what we already have to scratch the new itch in a secure way?

4. What do we know?
How can we tweak what we already know to adapt to our people's new needs?  How do we apply our knowledge of defense in depth, of best practices, of monitoring and incident response to maintain our relevance, and more important, our customers' trust?

Our customers' needs, the technology to provide services, as well as our requirements, are changing faster than ever.  How well are you serving the people you serve?


Wednesday, March 6, 2019

Flywheels and Bullets

Why are high-performers so much better at executing?  They seem to be able to take on new work and initiatives almost effortlessly and even gain momentum in the process. How do they do it?

There is never a single decision or action that will propel you to excellence.  Success is incremental. Jim Collins, author of Good to Great, writes of what he calls the Flywheel Effect.  Rather than thinking of work as a series of steps, think of it as a well-placed nudge to a wheel that is already in motion.  If you exert the right degree of force at the right inflection point, you will increase the momentum.  It will feel inevitable: if you do A, you almost can't avoid doing B, and C just follows naturally, and so on.  This builds organic momentum.  

Now the problem is discovering what actions propel the flywheel.  Experiment.  Fail as much as you have to, but fail small.  Fail early.  Fail when it's cheap and easy to clean up any mess you make.  Most important, fail while you still have momentum so you can course correct and find the right way to build momentum.   The key is to fail until you find the sweet spot, the synergy. Then you stoke it patiently over a long period of time and you will see that your results begin to amplify.  You will also begin to discern an underlying logic that you can extrapolate into other categories of work. The flywheel principle explains why the Lean principle of failing fast is such a key element.

Another of Collins' dictums is to fire bullets first, then fire a single cannonball.  The idea here is that you can significantly reduce risk and maximize return by starting small.  Fire as many bullets as you need to in order to get your range and windage.  Then, when your target is within perfect range and you are hitting bullseyes with the small bullets, bring out the big gun and knock out the target.  In other words: baby step, baby step, baby step, baby step, baby step, baby step, giant leap, baby step, etc.

Slow and steady may win the race, but taking well-timed, calculated risks can provide exponential returns.

Tuesday, March 5, 2019

Stop Counting and Start Slicing!

You say you don't have enough data to start.  I say you aren't slicing your data enough.

Fixing things is hard.  It takes time and you never know what you should fix first.  Everything seems important.  You have to make distinctions between the various things that are broken in order to set priorities, and the most defensible and repeatable way to do that is to measure them.

Here are some thoughts on measurement.  I highly recommend Doug Hubbard's brilliant "How to Measure Anything", which inspired a lot of my thinking on this topic.  Gathering data is time-consuming and expensive, but you probably don't need as much of it as you think. 

For example, if you take only five random samples, there is a 93.75% chance that the true median of the whole population falls within the range of those five values.  No, you won't see the outliers or black swans, but you'll know quite a bit about your problem very quickly.  You can spot trends even more quickly with the Urn of Mystery rule.  Imagine you have a jar filled with two colors of balls, in unknown proportions.  If you randomly pull a single ball, there is a 75% chance that the majority of the balls are that color.  So quit saying you don't have enough data to know where to start.
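If you don't believe those numbers, both are easy to sanity-check with a quick Monte Carlo simulation.  This is just an illustrative sketch (the function names are mine, and the urn rule assumes the jar's color mix is itself uniformly random, which is the assumption behind the 75% figure):

```python
import random

def rule_of_five(trials=200_000):
    """Estimate P(true median falls between the min and max of 5 samples)."""
    hits = 0
    for _ in range(trials):
        sample = [random.random() for _ in range(5)]  # uniform(0,1); true median is 0.5
        if min(sample) < 0.5 < max(sample):
            hits += 1
    return hits / trials

def urn_of_mystery(trials=200_000):
    """Estimate P(a single drawn ball matches the majority color),
    assuming the urn's color mix is itself uniformly random."""
    hits = 0
    for _ in range(trials):
        p = random.random()             # unknown fraction of, say, red balls
        drew_red = random.random() < p  # draw one ball
        majority_red = p > 0.5
        if drew_red == majority_red:
            hits += 1
    return hits / trials

# Analytically: P(all 5 samples above the median) = P(all 5 below) = (1/2)**5,
# so P(median inside the sample range) = 1 - 2*(1/2)**5 = 0.9375.
print(round(rule_of_five(), 3))    # ~0.9375
print(round(urn_of_mystery(), 3))  # ~0.75
```

Both estimates land within a fraction of a percent of the analytical values after a couple hundred thousand trials.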

But wait, you say, my shop is complex - we have lots of different-colored balls, not just two. Of course you do, and the solution is to slice the data thinner.  You aren't trying to fix everything at once, so why would you measure it all at once? 

Pull out all your data and sort it.  Sort it by issue, by number of occurrences, by location, name, severity, model number, IP address - any attribute that can help you group it into smaller sets.  Grouping and ordering it differently can help you spot trends or opportunities.  When you finally get the right slice, you'll see a clear path forward.  You'll find a commonality that highlights a particular issue or set of related issues.  These are your quick wins.  Quick wins impress everyone, because they make the team realize it's possible to make progress, even if it's tiny and incremental*.

At some point during this analysis, you'll probably also find that the 80/20 rule applies to your problem - i.e., 80% of your problem can be solved by fixing 20% of your stuff.  Let me say that again: you can probably resolve 80% of your issues by touching only 20% of your stuff.  This is your big win, and big wins fuel morale.  Morale fuels momentum, and if you're not careful, you'll find yourself in the middle of a virtuous cycle.
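Here's a minimal sketch of that slicing using nothing but Python's standard library.  The incident records, host names, and issue labels are all invented for illustration - the point is just that grouping the same rows by different attributes reveals different quick wins:

```python
from collections import Counter

# Hypothetical incident records: (host, issue) pairs.
incidents = [
    ("web01", "weak-cipher"), ("web01", "weak-cipher"), ("web01", "open-port"),
    ("web02", "weak-cipher"), ("db01", "default-creds"),
    ("web01", "weak-cipher"), ("web01", "open-port"), ("web02", "weak-cipher"),
]

# Slice 1: group by issue type to find the dominant problem.
by_issue = Counter(issue for _, issue in incidents)
print(by_issue.most_common())  # one issue type dominates the list

# Slice 2: group by host to see which few assets drive most findings (80/20).
by_host = Counter(host for host, _ in incidents)
total = sum(by_host.values())
running = 0
for host, count in by_host.most_common():
    running += count
    print(f"{host}: {count} findings ({running / total:.0%} cumulative)")
```

Each `Counter` is one "slice"; re-running the same loop keyed on severity, location, or model number gives you the other cuts.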

So stop counting and start slicing!

* This type of data slicing is a simple form of OLAP pivoting


Monday, March 4, 2019

It's Time to Train the Trainer

Why is most mandatory corporate training so bad?  It is utterly devoid of humor or character, with nauseating stock photos of smiling "teams" sitting around tables or shaking hands.  Students are expected to learn every policy and best practice in 60 minutes, and at the end there is a silly quiz that tests their reading comprehension - along with their patience.

Why do organizations require training?  Is it because the organization is required to do it?  What is the training for - is it to change people's behavior in order to achieve a specific outcome, or is it just so you can say "I told them not to do that" when something bad happens?  Even worse, is it just to tell your regulators that you checked the training box?

Cybersecurity awareness training may be the worst of the lot.  Students are expected to digest a mind-numbing array of concepts, policies, regulations and best practices, from dumpster diving to asymmetric encryption. At the end, there's a "knowledge check" to prove they can regurgitate sections of the security policy on demand.

All the cybersecurity training I've ever seen consisted of citing a dozen legal statutes, then going through phishing, encryption, privacy, password best practices, social engineering, acceptable personal use, permitted devices, appropriate websites, email etiquette, and more.  It's not working.

  • It's too much data for a human to assimilate in an hour.
  • There aren't any real-world examples of how your organization is affected.
  • The student will tune out anything she doesn't feel is relevant to her situation.
  • You don't have any way of determining whether or not the training was successful.  
    • Actually, that's not true - you have the dreaded mandatory survey at the end of the session.

You're missing a rare opportunity by not putting enough thought into mandatory training. Mandatory training is a shared experience and you have everyone's attention for an hour or two. This makes it a unique chance to change the culture of your organization and boost morale by creating a positive experience.  Done right, training can help everyone be better at their job, and yet this opportunity is often squandered because it lacks forethought and clear, measurable goals.

Instead of trying to get your colleagues to memorize cyberlaw, what would happen if you asked HR and your incident response team to tell you the three bad behaviors they most want to eliminate from your organization?  Agree on three things that happen in your shop that you'd like to eliminate once and for all.  Get examples of the behavior, and how it impacted you.  Most important, get numbers - how many times did it happen last year?  Focus your entire training on those three things.

For example, if plugging in personal USB drives has caused data exfiltration or virus infections, then quantify it, and train people not to do it.  Provide case studies that show how USB drives have impacted the organization.  Try to make the training fun or at least engaging. Forget the quiz results - if the training worked, you'll see a measurable drop in USB-related incidents.  If you don't, you need to tweak your training.

By taking an incremental, focused approach, you can use your training program to inspire your team to do better work, and help your organization avoid risks.  If the training is created with wit and creativity, it might even boost morale.

It's time to train the trainers.

Saturday, March 2, 2019

Predictions

Can people predict the future? 

The results of tarot or palm readings often feel uncannily accurate, even though you know they're random.  One explanation for this is called hindsight bias: essentially, you create a story backwards from the outcome to the real or imagined beginning, eliminating any details that don't fit the prediction.  Another theory is that your prediction actually drives your subsequent behavior, thus influencing the outcome.

Either of these subconscious tricks is fine when you're playing with cards, and may even work in your favor when you predict positive outcomes.  The problem is that many of us seem to be biased towards negative predictions - especially when it comes to cybersecurity! 

Be honest - don't you get a covert thrill when you tell a System Owner "I told you so" when they fail to act on your recommendations? 

I think the biggest problem with predictions comes when we make negative predictions about another person's behavior.  Saying, or even thinking, "Kumar won't show up to the meeting.  He never comes to our meetings" taints your engagement with Kumar, no matter what.  If he does show up, you say, "Hmmph, so Kumar finally decided to show up."  If he doesn't make it, even if he has a great reason, you get to say "I told you so."  Either way, there is no positive outcome - it's a no-win game.


Even worse, since Kumar didn't show up to the meeting, you may rewrite the history of your relationship with him to create a reason for his non-attendance.  He's flaky, disrespectful, and doesn't care about security.  You may even exclude all the evidence to the contrary, just to support your predictive narrative.

What if you tried flipping hindsight around to give everyone the benefit of the doubt?  Always assume the best intentions - "Kumar is always supportive.  Something very important must have come up to make him miss a meeting."  Always predict the best outcomes - "I'm confident we will solve this problem together.  We always do, one way or another."

One final thought: hindsight bias can blind you to the things you could have done better on a project that ended well.  You could create a "doomed from the start" narrative around work that has a negative outcome, and wind up losing sight of all the things the team did right.  Although those lessons contradict your narrative, they often turn out to be the most valuable part of an endeavor.
In hindsight, of course.

Friday, March 1, 2019

A Blank Page

What do you do with a blank page?

A blank page is a clean slate, a chance to start fresh, an opportunity to transform the way you express yourself and the way others see you.  You start off with bold letters and new resolutions.  Yet, all too often, your commitment to a radical change of direction wavers, and after a few pages you're back to your sloppy handwriting and the same old rut.

How do you stay the course to change? 
How do you keep that fresh feeling fresh?  How do you keep from backsliding?

Is it okay to lower your expectations?  Should you lower the bar to make success easier, more achievable?  Do you set a grand goal and try to get there incrementally?  Is it defeat when you admit that you bit off more than you could chew?  Is it cheating if you ask for help?

It's your book, so it's your choice.  Just remember that the supply of blank pages is probably limited.
