You (or your organization, mission, family, etc.) pass the "onion test" for honesty if each layer hides, but does not mislead about, the information hidden within.
When people get to know you better, or rise higher in your organization, they may find out new things, but they should not be shocked by the types of information that were hidden. If they are, you failed to craft outer layers that appropriately describe the kind of thing that might be inside.
Passing example:

- Outer layer says: "I usually treat my health information as private."
- Next layer in says: "Here are the specific health problems I have: gout, diabetes."

Failing example:

- Outer layer says: "I usually treat my health info as private."
- Next layer in: "I operate a cocaine dealership. Sorry I didn't warn you that I was also private about my illegal activities."

Failing example:

- Outer layer says: "Is it ok if I take notes on our conversation?"
- Next layer in: "Here's the group chat where I mocked each point you made to 12 people, some of whom know you."

Passing example:

- Outer layer says: "Is it ok if I take notes on our conversation? Also, I'd like to share my unfiltered thoughts about it with some colleagues later."
- Next layer in says: "Jake thinks the new emphasis on wood-built buildings won't last. Seems overconfident."
Passing the test is a function both of what you convey (explicitly and implicitly) and of the expectations of others. If mocking group chats are the norm, then nothing needs to be said to avoid shock and surprise. The illusion of transparency comes back to bite here.
Social friction minimization is the default trend that shapes the outer layers of a person or institution, by eroding away the bits of information that might cause offence, leaving layers of more pungent information underneath. The “onion model” of honesty or integrity is that each layer of your personality or institution should hide but not mislead about the layer underneath it. This usually involves each layer sharing something about the kinds of information that are in the next layer in, like “I generally keep my health information private”, so people won’t assume that a lack of info about your health means you’re doing just fine health-wise.
It takes a bit of work to put signposts on your outer layer about what kinds of information are inside, and it takes more work to present those signposts in a socially smooth way that doesn't raise unnecessary fears or alarms. However, if you put in that work, you can safely get to know people without them starting to wonder, "What else is this person or institution hiding from me?" And, if everyone puts in that work, society in general becomes more trustworthy and navigable.
I started using the onion model in 2008, and since then, I’ve never told a lie. It’s surprisingly workable once you get the hang of it. Some people think privacy is worse than lies, but I believe the opposite is true, and I think it's worth putting in the effort to quit lying entirely if you’re up to the challenge. Going a bit further, you can add an outer layer of communications that basically tells people what kinds of things you’re keeping private, so not only have you not lied, you’ve also avoided misleading them. That’s the whole onion model.
I have found this model extremely useful over the last few months when talking about organizational strategy, as a way of carving between "not everyone gets to know everything" and "actively pointing people in the wrong direction about what's true lacks integrity," and of avoiding "I didn't lie, but I knowingly misled."
So far I have thought of it as a backwards-looking reflective device (what on the inside would people be shocked to find out, and how can I make sure they are not shocked?) rather than as forward-looking signposting of all the things I might want to signpost, but I could imagine that changing. (I.e., right now I'm treating this as a useful quick tool, rather than a full orientation to honesty as Andrew does, but that could definitely change.)
In general, over the last few years I have shifted pretty far towards “transparency, honesty, earnestness are extremely powerful and fix a lot of things that can otherwise go wrong.”
On a different note: for me, virtue ethics is attractive but not real, and tests for integrity are important and useful pointers at things that frequently go wrong and could go better, rather than referenda on your soul. I would guess there are situations in which glomarizing is insufficient, and in which focusing too much on integrity will reveal the existence of secrets you have no interest in revealing, at least if you are not massively skilled at it.
[Some small edits made, including to the title, for clarification purposes]