Tactical Tech's new "Holistic Security" report


#1

https://holistic-security.tacticaltech.org/

Curious to hear the community's reactions to this.


#2

Appreciate the motivation here: too often, security material is still tutorials for three and a half software utilities, with a blanket "you must also consider your physical security" thrown in alongside a vague pointer to the need for a broader strategy. This is the opposite: it tries to address the human experience of being under threat, while giving very little practical advice.

To be honest, I really struggle with the outcome: a lot of over-engineered concepts are presented and discussed, with little practical advice to hold on to. It has the feel of a creative writing exercise targeted at other trainers and experts, rather than something that I'd want to study if my day job was getting beaten up by police.

This isn't to say that the encapsulated knowledge isn't valuable (I have huge respect for the folks at TTC), but I struggle to see how this document would replace (or even augment) an in-person strategy process.

Would love to hear more from people who are working in more high-risk environments...


#3

Like you, I appreciate the intention, but I feel this report has exactly the opposite problem of tool guides like Security-in-a-box.

It's a work of Theory, if the Audre Lorde opening quote didn't make that clear. And I'm a part-time academic myself so I get it, but I don't see that this would be high on the reading list of someone who has concrete security problems or responsibilities.

I'm certainly not saying PGP tutorials are superior. Rather, we're still missing the sort of mid-level operational advice that new practitioners really need. Something like, here are the greatest threats in your region / in your line of work / against people like you, and here is how you integrate protection into a plan that considers physical, psychological, technical, and legal issues.

I respect and admire Tactical Tech, and they can certainly write whatever tickles them, but I can't help but feel that all this work missed an opportunity to really connect with user needs. There's so much important information at different levels of technical detail and contextual specificity... I'm surprised TT would expend so many words on such abstract ideas when people are getting owned by phishing daily.

They are, after all, Tactical Tech.


#4

Yeah, I agree. This is weird. Recognizing that security extends beyond tech for at-risk users is certainly welcome, but the assumption that tech people are the best people to write about those other risks is bizarre. PGP guides may be annoying, but at least the people who write them are providing tangible value in the form of knowledge that the end user may not possess. What do tech people know about physical security? This is a substance-light overreach. The verbosity and gilded white-paper format are also strange, for example:

"Our approach to security and protection must take into account the
effects not only of physical violence, but also structural, economic,
gender-based and institutional violence, harassment and marginalisation.
This may be perpetrated by the State, but also by private corporations,
interest groups, non-State armed groups, or even our own communities
and those close to us This can deeply affect our psychological
well-being, our physical health, and our relationships with friends,
family and colleagues."

I'm sure I would get a nice head pat for writing that in a freshman seminar at Oberlin, but how does this "guide" actually help people? Shouldn't the end user develop their own politics? Security principles are universal, whether you're Al Qaeda or Gandhi. There are plenty of people doing important work to improve their lives and societies who don't necessarily subscribe to this hip jargon alphabet soup style of politics that only seems universal when you're inside the NGO bubble.

Overall, one doesn't get the sense from reading this that Tactical Tech consulted much with end users to get their input and experience before writing this "guide."


#5

I don't mind the NGO political jargon so much, though I can see how it might be off-putting to some. Rather, my concern is that -- to disagree with you slightly -- security principles are not really universal. They depend enormously on what you're trying to do and where you are operating, and among all the many "security guides" that have been perpetrated (including by me), there is almost none of this sort of information. It's all either very tool-oriented or very abstract. Ethan, your activist guide is a welcome exception.

Please someone prove me wrong by pointing us all to a fantastic guide based on solid user research?


#6

I don't think we disagree.

It may just be that security is always too contextual to come up with definitive, holistic advice. I realize that this sounds like I'm contradicting myself, since I just said that "security principles are universal" - and they are, but for people trying to achieve something in an adversarial environment, security just needs to be "good enough" not to compromise that objective. As a result, security advice tends to circulate like folklore within given at-risk groups. Only geeks talk about security in general, or compare security solutions across different at-risk groups, because geeks are doing something fundamentally different: they are either figuring things out from first principles, because they are curious, or they are demonstrating their knowledge in order to seek social status among their peers.

Now, at-risk groups certainly contain nerds who both need security and enjoy it for its own sake (hi), but these people are not the majority. I think that the folkloric circulation of security knowledge within at-risk groups is so common because it is effective (enough) and economical. If some tool, technique, or practice stops working, the group will find out because their adversary will start compromising their peers. This is true of activists, "terrorists," and people fleeing domestic violence. All of these people are at least aware of the risks posed by their cell phones, for example.

An analytic approach to security can be an advantage: if you take the time to understand the technologies and keep up with adversary capabilities, this can keep you safer (along with anyone who follows your ad-hoc research). However, this takes an extraordinary amount of time. Only a nerd with a sincere interest in this stuff would bother. For most at-risk people, priority one is getting things done, and a folkloric approach to security research is sufficient: your peers are the canaries, and if they start "dying" (or dying), you figure out why and adapt. This approach is less safe at the individual level, but very economical at the community/population level. After all, this is how we learned not to eat certain foods: people tried dumb things.

Well, you might ask, why don't people just listen to the nerds? If some nerds put in the time and effort to research what is best for their particular community, then everyone can benefit from their security research. Mostly this doesn't work. Nerds are impractical. We recommend OpenBSD because we heard it's super secure. We worry about firmware attacks when our adversary is a local law enforcement agency. We plan criminal acts on Signal, smug in the knowledge that our words are wrapped in PFS crypto, prefaced by an ECDH kex, and displayed on a client written by very competent people... and completely neglect the fact that we just created an evidence bonanza: your "friends" will testify against you - at least one of them will probably crack, decrypt their device (if they had disk encryption turned on in the first place), and hand your Signal conversations to the district attorney.

Basically, some nerds are right, some of the time, but even though we understand the principles at a much higher level than our fellow at-risk users, our suggestions are inevitably impractical, because they are not born of an economy of effort against a chosen adversary. They are born of our love of information and technology itself.

So, should we just give up and go home?

No, I don't think so. Nerds are useful when we are participant observers, embedded with the end user and facing the same risks. That is how a hobbyist with a brain full of baseball-card-style knowledge about security becomes someone who can make accurate and useful suggestions that take into account the specific capabilities of a chosen adversary, as well as the time and resources of a given user group.

Operational security is a "hard" social science. The basic principles have been around for centuries. What isn't so straightforward is picking a combination of tools and techniques that will offer the "minimum viable" security necessary for the end user to accomplish their objective. That part is more like an art - or evidence-based folklore, if you prefer.

After all, operational failure is proof.

Basically, if the people with technical knowledge don't go out into "the field" and experience what is actually practical under given constraints, then their informed advice will be ignored.

P.S. It's not so much that I object to the Tactical Tech authors' politics; rather, I resent their implied assumption that everyone who might read or benefit from their advice shares their worldview.


#7

New post from TT clarifying their thinking on their report:

https://holistic-security.tacticaltech.org/news/holistic-security-how-we-got-here

At a gathering of trainers in early 2013, bringing together experts from the three core fields within protection for human rights defenders – physical security, digital security and psycho-social well-being – Craig Higson Smith raised a question which became mantra-like in the later stages of the project:

"What's the minimum that I need to know about your job, so that I can do my job better?"

This part particularly nails the problem:

effectively teaching digital security is a monumentally challenging task. As shown in our recent research, if it is not taught in the context of real-life scenarios and workflows, within teams or networks, and while providing ongoing support, uptake can be dismal.

And I think this identifies both the strength and weakness of the holistic security approach:

What this means is that Sections 1-3 of the manual (Prepare, Explore and Strategise) focus on creating a framework for analysing and responding to threats. However they do not provide concrete tactics for dealing with specific threats.

It seems they do intend to publish more concrete information, yet they don't see it as particularly valuable:

But while we hope to develop further mini-guides within Act for other commonly occurring, high-risk activities, these will remain a drop in the ocean.

This gets to the heart of my frustration. I don't think anyone here needs to be sold on the idea of integrated security (TT, you had me in 2013). And I think we can all agree that "tool guides" are not the way to approach security. But it's precisely the specific, contextual information about local threats and tactics that is hardest to learn and most immediately valuable, and that's neither a tool guide nor holistic security.

How does a journalist from country X working on Y type of story learn about what might apply to their particular situation? To me, this is the great challenge. The TT community must know an incredible amount about these sorts of problems -- would they be willing to publish on it?