Building security culture on infrastructure teams | Franklin Hu | #LeadDevLondon

Hello. I’m Franklin. I work on infrastructure and security at Stripe,
and I’m here today to talk about building great security culture. I think security is something that is on all
of our minds these days. It seems like every other week there’s some
new critical vulnerability, or some situation where our personal data has been leaked or
stolen, or mishandled. As more of the world comes online, it’s
critically important that we build products with security as a design constraint, so we
will talk a bit about this today. At Stripe, our goal is to increase the
GDP of the internet. We want to make it really easy to run businesses
that are internet first, and that includes everything from getting incorporated to taking
payments, and things like managing fraud. And so payments is a decent chunk of our business,
and I think that, when we are dealing with money and people’s livelihoods, the security
implications are somewhat obvious. But I think all of our companies and organisations have a responsibility
to protect the data that we’re ultimately entrusted with. So, building and maintaining trust is one
of the challenges that we’re going to address today. Scaling with growth is another one of them. Start-ups and rapidly changing organisations
are dynamic, and sometimes chaotic, places. The nature of the business, of the product,
and of the customers changes pretty rapidly over time. And these are situations where the security
function in a company may not be well staffed: it may be a team, it may be a loose collection
of individuals, and they might not have the bandwidth to help the entire
company. Maybe your company has doubled in size over
the past year. That’s great because now you have twice the
number of people who can ship things, but it’s also twice as many laptops that can get
infected with malware. Culture won’t solve these things on its own. But if we encode our values and beliefs about
security into our culture, it does give us a more consistent way of approaching problems. So we are going to talk about three elements
of security culture today: learning and growth, empathy, and responsibility. Not everybody is a security expert. But it is a skill that you can build up over
time. We want to build a space where people can
ask questions, they can try things out. They can fail, and they can fail in a safe
and supported way. Too often, security culture is implemented
through fear and shame. Who here has been told that you should lock
your laptop when you walk away from it? Yes. It’s good advice, especially if you’re working in a public place. But I’ve seen a lot of organisations
try to implement this by shaming people who don’t lock their laptops, right? If you step away from your laptop for a minute
or two, someone might come up and send an email to the entire company. We don’t want these types of cultural norms
that shame people for accidents. Brené Brown researches things like courage,
shame, and empathy, and she has found that shame causes one of two reactions: either
people tend to disengage or withdraw from situations, or they tend to get aggressive
and try to take control. We don’t want either of these. We want people to come to work as their best
and full selves, and not be afraid, embarrassed, or ashamed. If someone accidentally clicks a phishing
link, runs the wrong command in production, or leaves their laptop unlocked, we want these
to be learning opportunities so that we can change our processes or our tools. We don’t want to institutionalise patterns
where people are shamed for accidents or for doing what they think is the right thing at
the time. So I think that one of the best parts of working
in a diverse and dynamic organisation is that there are going to be situations where people
disagree about things. Maybe you’re lucky enough to go through an
acquisition and have to reconcile years’ worth of differing technical choices. Maybe you’re about to go through a regulatory
audit and you need to change a bunch of workflows for folks on different teams. These are situations that we want to approach
with empathy, and ultimately, figure out what the best path forward is. There are some cases where security teams
do need to speak in absolutes: you must not use SSLv3, we know it’s vulnerable. But these cases are relatively rare. Instead, we want to focus on empathy and try
to understand where the other person is coming from so we can ultimately mitigate whatever
risks there are and get to that business goal. We want to build a culture where people come
to the table focused on the problems so that we can build towards that shared future. I think that a big part of this is recognising
that, if there are disagreements, it’s often due to a divergence in the context that those
people or those teams have. We want a culture that’s focused on bridging
the broader organisational context with that of the individual team or person. And lastly, the third element is responsibility. Security is an integral part of the design
of systems, of processes, of features. Whether you’re building a photo-sharing app or an
infrastructure tool, you can’t bolt security on after the fact. It’s incredibly important that everyone sees
security as part of their job. Developers have the most context for the problems
that they’re trying to solve, the pain points that the users see, and so we want to build
a culture where we trust and hold developers responsible to make the right call, and trust
that they will ask for help when they need it. That said, I think that, coming at it from an
infrastructure or platform perspective, the hope is that over time we will be able to
make people’s lives easier by building the right tools and abstractions so that they
have to worry about fewer classes of security issues. This might be things like libraries or systems
that handle authentication, or that eliminate cross-site scripting. So I’m going to posit that, without any one of these three
elements, we end up with a security culture where security
is someone else’s job, and that makes it incredibly difficult to build great, safe, and secure products. All right, so how do we do this? We will cover a couple of strategies for building
a great security culture. The first of these is developing individuals. Nobody comes into the world as a fully formed
expert in cryptography or reversing malware, but it’s something you can pick up over time. A lot of folks I work with don’t have a background
in security – some are self-taught and some learned on the job – and there are people who are
interested and want to learn more. Especially in fast-growing organisations, there are lots
of opportunities and chances to grow. If you’re working in an organisation that
has rigid structures, where teams have head-count and managers are protective
of that head count, there can be some friction in moving throughout the
org and actually working on different types of projects. Rotations are a great way to address this. This is when someone spends three to six months
working on a different team, building context, developing some domain expertise, maybe building
some relationships, and it gives you a chance to see things from a different angle. For folks that do a rotation on a security
team, their job might involve things like helping with threat modelling, reviewing other
teams’ designs. It might be using a lot of the tooling and
systems that security teams deal with day-to-day, which they might not otherwise have access to
because those systems contain sensitive data or sensitive controls. Or it might be things like
working on threat operations: what are the types of threats and attacks that the
company is seeing on a day-to-day basis? How do you diagnose a malware infection and
get it cleaned up? So, let’s say we have a few people on our
teams that are interested in security and are building up some domain expertise.
A common next step is to formalise this relationship between them and the security team. Security advocates, also known as
security champions, are a way to deputise folks throughout the organisation that have
some security knowledge or interest and have them be the point person on their team that
is responsible for the security of that team’s designs and things like that. They can be depended on to escalate larger questions
back to the main security team. And it’s really helpful for high-touch projects
that might be security-sensitive. So we’ve talked a bit about growing individuals
into roles to help support their teams, to help bring a security mindset into day-to-day
conversations, but what can we do at the broader organisational level? We’re going to focus on this idea of building
a culture of learning. One of the most important things is that we
need to emphasise that learning is part of the job, and as leaders, we need to make space
for it in our schedules. Discussion and presentation forums are
a fairly widely used concept, and they come up pretty organically. They are usually informal: it might be a brown-bag lunch,
or people who get together to talk about papers or books that they find interesting. These are often self-organising and
not well supported by organisations. If we want to build a learning
culture, we should make sure that these are recognised and incentivised from the perspective of performance reviews and things
like that. So the second thing that I’m talking about
here is tabletops and game days. So, any systems that we build have goals,
and they have constraints. They won’t handle all types of situations
or environmental changes. Exercises like tabletops and game days are
a great way to share context on a system, to explore it, and to see where it breaks. So a tabletop is an exercise where you take
a real scenario, such as your base images have been compromised, and they’re leaking
sensitive information out, and you talk through a response, sort of in a Dungeons & Dragons
style. These systems may become complex. They may involve human and technical factors. These are also a really great opportunity
to bring in other teams and work cross-functionally. Take an example: you’ve received
a DDoS threat, right? Often companies will receive emails that say,
“Please pay us ten bitcoins, otherwise we will take your site down.” Working through these types of exercises helps
to build organisational relationships as well as get through what might be some broken windows
in your processes. If you’re bringing in teams like legal or
regulatory, they can help with things like getting in touch with law enforcement or making any sort
of regulatory disclosures. To take tabletops a step further, game days,
which come from Etsy, are essentially exercises where we take scenarios or inject faults and
take actions in production. You might kill your database master and
see: does the system react in the way that we expect it to? Do we get paged? Do the alerts go off? These are great ways to spin up new folks
on a team and help them understand the limits and edges of a system. This works very well for systems that are
security-focused as well because, often, those systems have much tighter and better
defined invariants, and there are great resources out there for how to run game days, which
I will link at the end. So a great opportunity for sharing knowledge
is also when a project wraps up. People love reading about things that have
shipped. It’s really energising to see forward progress
in other parts of the organisation. And it’s a valuable time not only to say what
you’ve done but also how you did it, what you learned. At Stripe, we have a couple of mailing lists
called Shipped and Fixed, which everyone is subscribed to, where people send updates on
product launches, on bug fixes, on large deals we’ve signed. These aren’t all technical
changes, but they capture the momentum of the company overall. For security-focused Shipped emails, we try to focus
a lot on why we approached it in a certain way, and why it’s significant. Very often, infrastructure or security work
can become invisible. Often, if you do a migration well, nobody
actually notices. So Shipped emails are a great way to capture
that impact. So our process for handling these has evolved
a bit over time. We have some team-specific lists now. As the company grows, it can be a little daunting
to send an email to everybody that says, “I added a command line flag that makes our lives
a little bit easier.” Team-specific lists lower the bar, and
reach a more targeted part of the audience. We’ve also found that writing Shipped emails
is a great design tool. When you’re in the process of scoping out
a system or a process, it often helps to write a shipped email at that point to kind of help
crystallise both your goals as well as your expected impact. There have been a lot of situations where, in
the process of writing a Shipped email ahead of time, I’ve noticed that a design can be
broken up into more incremental steps that are tighter scoped and easier to explain and
talk about. Lastly on this topic of learning culture,
there’s a lot that is going on in our industry. We have access to so many different types
of perspectives, and things that people are doing at other organisations. Over last year, Stripe has been building this
magazine called Increment, which is a print and digital publication that really focuses
on how our industry builds and operates software at scale, and especially some of the human
aspects. We had a recent issue on security, which I
encourage you to check out. There are great articles on threat modelling,
or password culture, and stuff like that. I have a couple of print copies with me, so,
if you’re interested, come and find me at the office hours. So the last thing I would like to touch on
is process. A lot of organisations have security review. I think from the early days, Stripe has always
tried to frame this as a collaborative and iterative process. It’s not a judgment, it’s not a tribunal,
it’s a hurdle you have to clear and there definitely shouldn’t be anyone crying. Because we hold our teams responsible for
the security of their products and their systems, the goal of security review is really to bridge
that context and see if they’ve missed anything. Our process for doing this has evolved a bit
over time. Originally, it might be that you would sit down
informally with someone from security and talk through a problem. Nowadays, we ask teams to send in a design
document as a pre-read and we schedule 15 minutes to talk through any questions or follow-ups. This is very much the beginning of a conversation. We want to get involved with security at the
very beginning of a design so that we can make sure that we are asking the right
types of questions, and that we are ultimately solving the right problem. There are some cases where we do inject a
bit more rigour, such as when we are sharing data externally or onboarding a new partner, and those don’t go
through this process. But generally, it works very well. Security reviews are also a great place to
invite less experienced members of your team to come and see how the process works, to see
the types of questions that the security team is interested in, and to help get into that mindset. From the security side, it’s really useful
to do these reviews, because it gives you a longitudinal view of the types of problems
that the organisation is facing overall. We’ve staffed numerous engineering projects
that have been focused on recurring issues that we discovered. Great. So that’s pretty much all I have. To sum it up: we’ve talked about three
elements of building great security culture. Those are responsibility, learning and growth,
and empathy, and also some strategies for doing this. I think that, ultimately, culture is only one
piece of the puzzle, and we need to build it in conjunction with technical systems
and controls in order to build safe and secure systems. But at the end of the day, how
we build products is just as important as what we build, and so I encourage you to think
a bit more about this the next time it comes up. I hope you found this helpful, and I would love to hear any thoughts that
you have. Thanks so much!
