Let’s say your job is to bake pancakes in a pancake restaurant. With good intentions and some skill, you’ll make decent pancakes. Maybe even great ones.
Not so in government projects. In my world, we rarely agree on what a “pancake” is. Or whether we want one. Or if we even like pancakes. In the FDS project, everyone’s using the word, but it means something different to each of us. Add to that: it takes years to bake, involves multiple kitchens and thousands of cooks.
You might say: Aliza, this is not a fair comparison. We are not baking pancakes. You’re right. But that’s the point. A metaphor is a lie that helps us see the truth more clearly.
Good intentions are deaf and blind
Good intentions alone are deaf and blind to consequences. Usually, it’s the person explaining why something went wrong who uses the phrase:
“I had good intentions,” they say, as they hand you a birthday gift you’ll quietly pass on to the thrift shop the next day.
So intentions don’t map neatly to consequences. With good intentions, very bad pancakes can be baked, or none at all—just unfinished batter going bad in the fridge.
If good intentions were enough, our shelves would be lined with golden, crispy deliverables. They’re not.
Nothing can go wrong
A sentence that stuck with me this week:
“I need a story for the officials—and a guarantee that nothing will go wrong if they join this Federated Data System.”
Just to be sure, I asked: you mean a 100% guarantee?
Yes. No headlines. No scandals. No public fallout. Ever.
There is no such guarantee.
And yet—this discomfort is real. In the Dutch context, federated access control means the user of the data decides who gets access, not the supplier. Legally, that’s where the responsibility belongs. Logically too: suppliers can’t verify all the user’s claims.
But emotionally? That feels… unsafe. And it is, to some extent. But the question should be: is it less safe than the current state? And that’s where fallacies come in. We are biased towards the current state—especially when the future state is—well—uncertain, unclear, unfamiliar.
So imagine: we don’t agree on the recipe. We’re not sure what we’re making. And some of us want a 100% success guarantee, or they won’t even enter the kitchen.
Also, we’re not in one kitchen. We’re in different ones, spread across the country, connected only by ritualized group talks called “meetings”. We wrap our hesitation in questions:
“How exactly would this pancaking thing work?”
“We just need more clarity.”
And still—we all agree that pancakes, if ever baked, would be delicious. That the system needs to change. That there’s no real alternative.
This week, I kept wondering: how do we move forward, in this setup? Separate kitchens, risk aversion, unclear definitions, unspoken fears—and still, the conviction that this is the way forward to modernize how we exchange data responsibly.
So the good intentions need a hand.
The following week, I hosted a session with 20 or so experts. We charted the risks—and the possible paths forward.