What Is “Trust”?
A shared language for people, systems, and policy.
Why define trust now?
We use the word constantly—about leaders, brands, platforms, elections, and AI—but we rarely mean the same thing. In public debate, “trust” can sound like comfort, loyalty, or reputation. In compliance and governance, it should mean something firmer: confidence in expected behaviour over time, backed by evidence.
How different people use the word (and why wires get crossed)
We come to “trust” from different vantage points. Those vantage points shape what we look for—and what we fear.
Interpersonal trust — “I trust you because of your character and how you’ve treated me.”
Signals: honesty, care, consistency, keeping promises.
Institutional trust — “I trust the organisation because rules are clear and applied fairly.”
Signals: policy clarity, independent oversight, audit trails, proportional consequences.
Systemic/digital trust — “I trust the process and technology because it’s secure, auditable, and resilient.”
Signals: security-by-design, privacy-by-default, explainability, uptime, tested recovery.
When we don’t name our lens, we talk past each other. Someone may praise a leader’s “good heart” while a risk officer asks for breach stats. Both care about trust—just from different angles.
Our working definition at MJC (and at CCI)
Trust = confidence in expected behaviour over time — anchored in integrity and competence, demonstrated through transparency and accountability, and evidenced by results.
The building blocks of trust
Integrity — We do what we say we’ll do, even when no one is watching.
Competence — We have the skills and capacity to deliver, not just the intent.
Reliability — We perform consistently over time; promises survive pressure.
Transparency — We explain how decisions are made and what trade-offs exist.
Accountability — We welcome scrutiny and accept consequences when standards aren’t met.
Care — We design for human impact: dignity, inclusion, accessibility.
Safety — We prevent harm proactively: security, privacy, and well-being by design.
Redress — When things go wrong, people are heard and harm is addressed visibly and fairly.
Trust in systems and data (for tech & emerging sectors)
For platforms, products, and public infrastructure, trust becomes operational. It must be built into architecture and process:
Security & Privacy by Design — encryption, access control, data minimisation, disciplined retention.
Explainability & Auditability — decisions are traceable; logs and evidence exist for review.
Governance Fit-for-Purpose — clear roles, separation of duties, vendor oversight, incident playbooks.
Resilience — backup, recovery, tested failover, realistic crisis simulations.
User Dignity — consent that is real, choices that are meaningful, dark patterns avoided.
How to spot trust: evidence, not vibes
Blend leading and lagging indicators:
Leading: staff retention, psychological safety, training coverage and quality.
Lagging: complaint volumes, response times, redress outcomes; security incidents per quarter and time-to-contain.
Assurance: independent audits passed and remediation closed on time.
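To make lagging indicators such as “security incidents per quarter and time-to-contain” concrete, here is a minimal sketch of how a team might compute them from incident records. The data shape and field names (`detected`, `contained`) are illustrative assumptions, not a reference to any real system:

```python
from datetime import datetime

# Hypothetical incident records: detection and containment timestamps (ISO 8601).
incidents = [
    {"detected": "2024-01-05T09:00", "contained": "2024-01-05T13:30"},
    {"detected": "2024-02-11T22:15", "contained": "2024-02-12T06:15"},
    {"detected": "2024-03-02T08:00", "contained": "2024-03-02T09:00"},
]

def mean_time_to_contain_hours(records):
    """Average gap between detection and containment, in hours."""
    gaps = [
        (datetime.fromisoformat(r["contained"])
         - datetime.fromisoformat(r["detected"])).total_seconds() / 3600
        for r in records
    ]
    return sum(gaps) / len(gaps)

print(f"Incidents this quarter: {len(incidents)}")
print(f"Mean time-to-contain: {mean_time_to_contain_hours(incidents):.1f} hours")
# → Incidents this quarter: 3
# → Mean time-to-contain: 4.5 hours
```

The point is not the code but the habit: trust claims become auditable only when the underlying numbers are defined, collected, and reviewed the same way every quarter.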
What trust is not
Not blind faith — “Because they say so” isn’t proof.
Not brand gloss — a logo can be loved and still fail a privacy test.
Not secrecy first — NDAs and security matter, but silence isn’t accountability.
Not risk-free — mature organisations admit trade-offs and show how they mitigate them.
If this framing helps your team align, feel free to borrow it. If you’d like a one-page “Trust Definition Map” to spark discussion, reply with “Trust Map” and I’ll share it.