Trustworthy allies

In my forthcoming book, I argue that tech, trust, and cooperation are central to technology governance and to building trust in innovation.

That being said, it’s been a few years since we’ve seen a good tech “alliance”. So, I’m excited to see what Microsoft, et al., put together with the new “Trusted Tech Alliance” announced at the Munich Security Conference. There are definitely reasons to be optimistic here. If you’ve followed my work on digital trust, some of the principles of this new alliance will be familiar:
* Transparent Corporate Governance and Ethical Conduct
* Operational Transparency, Secure Development, and Independent Assessment
* Robust Supply Chain and Security Oversight
* Open, Cooperative, Inclusive, and Resilient Digital Ecosystem
* Respect for the Rule of Law and Data Protection

These are great principles – transparency, ethics, security, resilience, auditability, and good oversight. For years, we’ve known that these help ensure the trustworthiness of digital systems. I’m very hopeful that this will get some traction and the community will expand. It’s a sad sign of the times that they specifically have to call out “respect for the rule of law” as a principle, but the recent history of tech and of the US federal executive branch makes it necessary. I’m hoping that some of the signatories to this new alliance take that seriously, because I haven’t seen too much commitment to the rule of law from some of them. Perhaps a tiger can change its stripes.

We need to move beyond principles to standards, and I think the execs leading this effort understand that. Principles are important because they give people something to weigh their actions against, but much more needs to be done before those principles can actually help earn trust. What needs to happen next – and it needs to be communicated as transparently as these principles were – is operations and assessment. The big questions this alliance leaves open are:
* What are these companies actually going to do to show they are adhering to these principles?
* How will we know that they are taking those steps?
* What standards of practice or technical standards show progress against these principles?
* How are we going to be able to assess these companies, and what can we do when they fall short?

I’ll certainly be watching this space.

Dialogue, Silence, Tech, & Trust

Ryszard Kapuściński, the Polish anti-colonialist journalist and poet, wrote: “Silence is a signal of unhappiness and, often, of crime. It is the same sort of political instrument as the clatter of weapons or a speech at a rally. Silence is necessary to tyrants and occupiers, who take pains to have their actions accompanied by quiet.”

There’s a lesson here about technology and trust: Who gets to be part of the conversation, and who gets silenced? With the immense power of new tech concentrated in the hands of a small number of technology owners, this is not an abstract question. It will determine whether tech erodes trust or whether we can earn trust.

Transparency or Dialogue?

Tech can be used to earn trust by increasing transparency and fairness. Or it can be wielded to destroy trust by enforcing silence. Unfortunately, I’ve seen far too much of the latter in recent weeks. The problem begins with how we talk about dialogue itself. Opening up honest dialogue is certainly a part of transparency. But that dialogue has to be honest and focused on understanding the truth. What we’re seeing instead is a troubling shift: the prioritization of dialogue as performance over actual transparency. Transparency demands the truth of what is being shared. Dialogue, stripped to its bare mechanics, demands only words.

When the owners and builders of new technology aren’t honest about its benefits and drawbacks, that erodes trust. When those owners create spaces where only their voices are amplified and others are quieted, that erodes trust further. When people witness technology being deployed to benefit a privileged few at the cost of the many, when they watch as tech is weaponized to silence inconvenient voices—that isn’t just unfair. It’s dangerous. And it threatens both technology and society.

For the past year, and especially at January’s World Economic Forum Annual Meeting in Davos, Switzerland, we’ve watched tech CEOs and billionaires celebrate their rising influence, increasingly finding themselves in positions of political power and favor. Their confidence is palpable. Their platforms are vast. And their willingness to use those platforms selectively – amplifying some voices while marginalizing others – has become impossible to ignore. The difference between private celebration and public suppression couldn’t be more stark. It couldn’t highlight the fundamental unfairness embedded in our current tech infrastructure more sharply.

Control and Silence

Kapuściński again: “Silence has its laws and its demands. Silence demands that concentration camps be built in uninhabited areas. Silence demands an enormous police apparatus with an army of informers.” Looking around, at Minneapolis and elsewhere in the US, I have to ask: what does it mean when the technology we develop becomes part of that silencing apparatus?

The difference between the “dialogue” on the promenade of Davos and the repression on the streets of Minneapolis couldn’t be more stark. It couldn’t highlight the unfairness of tech (and society) any more sharply. On the largest scale, social media, increasingly powered by AI, is shattering what’s left of our global digital commons. In the warring narratives of Minneapolis, the US federal government used whatever technical savvy it had at its disposal to elevate grudges and lies over truth and responsibility. As if seeking to embody this trend, TikTok, under new, ostensibly American and certainly less competent ownership, turned its addictive algorithm sharply to the right. On an individual level, the surveillance state we’ve been warning of for over a decade now allows for personal revenge against the administration’s enemies, with facial recognition used both to take away privileges and to track down new victims for ICE.

You can’t earn trust without listening to others; enforced silence breeds resentment and distrust. Using technology – especially the very tools that promised to connect us and democratize information – to create that enforced silence is a betrayal of innovation and of our best hopes for the future. Responsibility – for truth, for honesty, and for innovating for trust – can only be postponed, and people will only be silenced, for so long.

This hiatus was much longer than expected

In the meantime, I did, in fact, keep busy. Just not on this blog.

After spending some time at something like an international organization, with a strict outside-activities policy (!), I’m free and feeling comfortable getting back to this blog. It’s been a long and difficult road, but I learned a lot and hopefully helped some people along the way.

You can catch up on my brand new personal (like, personal-professional, we are not that close) website here:
https://www.dobrygowski.com

I’ve also written a book! It’s called Technology Governance: Build Trust into Digital Innovation, and it will be published in May 2026.

Preorder it today and be the first kid on your block who understands what Technology Governance needs to look like if we’re going to make it through our current adventures in technology, law, and policy.

Preorder here:
* on the publisher’s site (code KOGANPAGE25 for 25% off)
* on Bookshop.org
* on Barnes & Noble
* and if none of those work for you, on Amazon.