Redefining Trust in Digital Engagement
Krowdthink has spent a long time researching and thinking about how to build a business that is fundamentally trustworthy when delivering a digital service. This blog summarizes our implementation; its simplest expression is our Trust Pyramid. Some have described its articulation as an ethical business model. We make no such claim, but we appreciate the perspective.
Too many organizations treat trust as a binary condition: some customers trust the entity, some do not, and the aim is to get a higher percentage to state their trust. But such goal setting subverts the whole objective of trust. It becomes a game, and often, especially online, it is less about being worthy of trust and more about gaining trust using methods that are not worthy of it, a tactic that time will always reveal. What's needed is a strategy that gets stronger as time reveals the company's true activities. Hence fostering a company culture that pursues the unattainable goal of being trustworthy is the only sustainable approach to addressing the online trust deficit built up by 15 years of business models that gamify the customer engagement process.
There are two primary perspectives of trust that differ but share the commonality of customer/user confidence:
An attitude of confident expectation that one’s vulnerabilities will not be exploited
Confidence that the value exchange is fair and equitable and that loss of trust drives an equivalent/proportional consequence on both parties
The issue of consequence will be addressed later, but is a critical component for the development of trustworthy commercial engagement.
Empowering the User/Customer
Trust is obtained through the commitment to empower the customer/user in the engagement process in an open, balanced, mutually beneficial way. There are three pillars of empowerment that a business must engage in to foster customer/user trust. In the digital context, a useful empowerment thought process is to consider all personally identifiable data, whether directly obtained (name, address, etc.), indirectly obtained meta-data (how a customer engages, when, where, etc.), or derived (information created through algorithms or correlation with other data sets), as conceptually belonging to the customer/user. The business is merely its temporary custodian. In the EU, the GDPR will legally enforce this perspective.
The business must be utterly transparent with its customer/user about what Personally Identifiable Information (PII) it obtains from a customer/user and how that data is used. It should be open about how PII is secured at rest and in transit.
We suggest publishing all data types relating to PII with simple descriptions of use and how that data is secured.
The GDPR is driving the evolution of new consent models with consent receipts that could eventually replace this publication process.
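To make the consent-receipt idea concrete, here is a hedged sketch of what such a receipt might record: who holds the data, what data type, for what limited purpose, and when consent was granted. The field names below are illustrative assumptions, not a published consent-receipt schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ConsentReceipt:
    """Illustrative consent receipt; real schemas differ in detail."""
    controller: str   # the business acting as temporary custodian
    data_type: str    # e.g. "location"
    purpose: str      # the limited purpose the user consented to
    granted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def as_record(self) -> dict:
        """Return the receipt as a plain dict for storage or display."""
        return asdict(self)

receipt = ConsentReceipt("Krowdthink", "location", "real-time crowd grouping")
print(receipt.as_record()["purpose"])  # real-time crowd grouping
```

A receipt like this gives the user a durable artifact of exactly what they agreed to, which is the transparency property the publication process above aims at.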
It is recognized that the majority of users will have limited ability to assess the veracity of this information, but if all businesses followed this approach, journalists would get savvy and assessment methods/entities would evolve to determine whether the data processing meets the standards set out in the rest of this model.
The company should maximize the ability of the user/customer to control what PII is shared with whom, whilst ensuring clarity about the limited purposes for which such PII is shared. This is both a UI issue and a UX issue. UX is more important, because the customer/user's assumptions about what is happening should always hold true. Control also means the ability to withhold PII whilst still operating constructively in the digital engagement process. Of course many online services will not function without some level of PII sharing, but innovative thinking about minimizing what's needed to deliver a capability makes it easier to communicate simple controls over the limited amount of PII that is needed to drive the online engagement function.
Many trust papers talk about accountability as the third empowerment strut for trust. However, this often defaults to an understanding that it's accountability to the law. The law is always the lowest possible trust bar; it's a codification of privacy and security principles that sets the lowest standard for operation. Worse, it is defined mostly by business and their pro-commercial lobbyists. Worse still, rarely does the individual have the financial, technical or legal capacity to pursue their rights under the law. Most privacy lawyers have become risk mitigators for their commercial clients.
Truly trustworthy companies seek to empower their customers/users with the ability to take their business to a competitor when/if their trust in their current service provider is lost. This is an extremely scary statement for most commercial entities, which primarily seek customer lock-in while not realizing that trust is the best lock-in of all. In fact the act of coerced lock-in degrades trust and is potentially unsustainable for the business.
To get trust one must first trust. Trust your business to do right by its customers, and trust your customers to do right by you in return. When you screw up, this trust will be rewarded with flexibility and time to address the screw-up.
When trust is lost and one party feels they suffered a significantly greater loss than the other, recovery of trust becomes even harder.
The GDPR creates fundamental new rights for customers both to delete their PII and to move it to a competitor. Far better to embrace these legislative requirements as business trust assets than to fight the law. Giving users a simple and unobstructed ability to delete and move their data is an empowering trust asset: customers now feel they have, individually and en masse, the ability to inflict consequence on a company and thereby drive a change of operation in their favour. In this hyper-connected world they can anyway; you may as well embrace it as a business asset.
To deliver against the trustworthy aspiration, one has to foster a culture in the company that consistently strengthens the trust model. It has to cover both technology and process in both development and innovation, as well as in the commercial engagement model.
The 7 principles of privacy-by-design call out aspects of the trust model and security model needed, focused on the context of assuring customer privacy in the digital engagement process.
If properly followed, you will end up with apps and services that don't need privacy settings, because each stage of sharing becomes an explicit opt-in. Arguably, a privacy setting is a privacy fail!
The explicit opt-in thought process, or consent model, is central to achieving PbD. PbD is not about not sharing; it's about being in control of what PII is shared with whom and for what purpose, backed by confidence that the service provider protects the user/customer from its own or any third party's inadvertent or deliberate use of their PII beyond what they have understood and agreed to.
It needs to be recognized that security is not an absolute. It's an arms race, and every digital company will suffer a breach of security if it operates long enough. So it's critical this objective is seen in the context of the next objective, data minimization. Data minimization is a solid underlying principle of security design. Keep the value of what PII you hold to a minimum; it's as much a liability as an asset.
Consider security before writing the first line of code of a system. Adding security later is harder and more expensive, and in some cases impossible without re-architecting the system. So architect for security; don't just code for it.
All security relies on a trust point, even encryption. So remember every security system is as fallible as the human processes that surround it. Security is thus a process of privileged access management, both in human terms and coding terms. Hence it has to be inculcated into the culture of all aspects of the business.
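The idea of security as privileged access management can be sketched in code terms. The example below is a minimal, hypothetical illustration (all names and roles are assumptions, not a prescribed design): every read of PII passes through a single role-checked gate, so the access policy lives in one auditable place rather than being scattered through the codebase.

```python
# Hypothetical sketch: route all PII reads through one role-checked gate.
PII_STORE = {"user42": {"email": "a@example.com"}}
ALLOWED_ROLES = {"support", "dpo"}  # roles permitted to read PII (illustrative)

def read_pii(user_id: str, caller_role: str) -> dict:
    """Return a user's PII only for privileged roles; refuse otherwise."""
    if caller_role not in ALLOWED_ROLES:
        raise PermissionError(f"role '{caller_role}' may not read PII")
    return PII_STORE[user_id]

print(read_pii("user42", "support"))  # a permitted, privileged read
```

Funnelling access through one gate also mirrors the human side of the process: the same role list can drive staff permissions and audit logging.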
User Data Minimization
This principle underpins both PbD and SbD. Given that no system can be guaranteed to be secure, it is incumbent upon a trustworthy entity to minimize the PII they hold, so that when a data breach occurs the consequence to the individual is minimized. Time is the greatest exposure risk for every digital system; even encryption becomes progressively easier to circumvent as time passes. In SbD it's a key principle: understanding what data you hold and how it is used is critical to architecting a security system. Less data means less complexity in securing the system and thus less risk of insecurity in it.
Don't obtain PII you don't need; don't store what you don't need to; don't transmit what's not needed. These simple rules maximize the potential to keep PII private and minimize the consequences of a security breach. Properly embraced, they lead to innovations that strengthen the trustworthiness of a company.
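The rules above amount to an allow-list applied before any record is stored or transmitted. A minimal sketch, assuming a hypothetical service that only genuinely needs a display name and a venue identifier (the field names are illustrative, not from the original):

```python
# Illustrative allow-list: keep only the fields the service genuinely needs.
NEEDED_FIELDS = {"display_name", "venue_id"}  # assumption for this sketch

def minimize(record: dict) -> dict:
    """Drop every field not on the allow-list before storing or sending."""
    return {k: v for k, v in record.items() if k in NEEDED_FIELDS}

raw = {"display_name": "Sam", "venue_id": "cafe-7",
       "email": "sam@example.com", "gps_trace": [(51.5, -2.6)]}
print(minimize(raw))  # {'display_name': 'Sam', 'venue_id': 'cafe-7'}
```

An explicit allow-list (rather than a block-list) fails safe: any new field a developer adds is excluded by default until someone consciously justifies its need.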
No Covert Profiling or Tracking
This principle needs to be called out explicitly because covert profiling is the default operation of almost all online services today, whether a simple website or a complex app. 'Cleverly' obtaining information about users, where they are, who they are, what they are thinking, their state of mind and so on, without explicitly communicating to the customer/user that this is happening, is central to the digital engagement culture of developers everywhere. Reversing this trend requires an explicit recognition of it and a topline commitment not to operate that way, if there is to be any chance of a constructive reversal in the culture of digital engagement models.
Open Business Model
The default business model of the Internet is the fundamental cause of the downward spiral of trust in digital engagement. The pervasive sentiment that because I go online the normal rules of society no longer apply is driving dishonest engagement, reluctant sharing and active obfuscation. A digital society that drives these norms is not constructive; it's destructive. It seeks to set new norms. Yet these norms are not fostered in constructive debate; they are covertly forced upon us and then validated when time exposes them, all in the name of commercial endeavor that primarily benefits the four horsemen of Google, Facebook, Apple and Amazon.
The Internet is still in its infancy, and the current business models were needed to enable innovation, since no individual consumer was prepared to pay their way directly; instead, payment has come primarily through profiling people to gain ever greater insight into their needs/wants, the better to target adverts or services. This model will continue well through the Internet's teenage years and into its young adulthood. However, there is clearly a community of users that desires a more constructive digital engagement model, one in which they are respected and thus can trust their suppliers. The only way this can be fostered is to be totally open about the business models of the new breed of businesses. Allow the business model to be debated and validated; only then can trust be fostered.
Most existing online business models can be flipped, offering users/customers new methods to engage in which they are empowered over what is sold to whom for what purpose.
We at Krowdthink are focused on the engagement model, using real-time data to create unidentifiable groups of people who share common targetable advertising assets such as location, time and contextual interest. We thus make real-time engagement the revenue asset without exposing any PII or retaining any profiling history: advertisers target the app, not the people, and the app groups people in real time. Advertising value is determined not by an individual's profile/preferences but by their value as a group, or in our case a crowd. The trick in our case is that we do not need to profile people in order to group them; they group themselves at a point in time, in a place, around a specific interest.

We have the added benefit that value is driven to the place owner first, as they are co-located with the Krowd and thus have a real-time contextual capability to deliver advertising engagement value. This opens up a new engagement opportunity for businesses otherwise forced to engage digitally via the personal profiling business model that our temporally and locationally displaced Internet demands.

Managing this advertising process so it does not intrude on the connected experience is as much of a challenge for us in the Krowd app as it is for Facebook or Twitter. However, we have a simple ad-block tool: an annual subscription to the app can block ads or allow users to tailor and select the ads they want to see. These business models will be implemented by Krowdthink once we have a sustainably large customer base. The business model does not intrude on privacy at all, nor does it require users to think differently from the revenue models Internet or app users are already familiar with, but its implementation balances their interest in privacy with the need for our business to cover costs and make a profit.
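The grouping idea can be sketched in a few lines. This is a hedged illustration under stated assumptions, not Krowdthink's actual implementation: anonymous, transient pings are bucketed by venue, a coarse 15-minute time window, and a declared interest, so the crowd key carries no identity and no history.

```python
from collections import defaultdict

def crowd_key(venue_id: str, epoch_seconds: int, interest: str,
              window: int = 900) -> tuple:
    """Bucket by venue, 15-minute window and interest -- no user identity."""
    return (venue_id, epoch_seconds // window, interest)

# Anonymous, transient pings: (venue, time, interest) -- no names, no profiles.
pings = [("cafe-7", 1_000_000, "live jazz"),
         ("cafe-7", 1_000_500, "live jazz"),
         ("cafe-7", 1_000_200, "football")]

crowds = defaultdict(int)
for venue, ts, interest in pings:
    crowds[crowd_key(venue, ts, interest)] += 1  # count only; keep nothing else

# An advertiser would target a crowd key, never an individual.
print(dict(crowds))
```

Because only counts per ephemeral key are retained, a breach exposes crowd sizes at places and times, not people, which is the data-minimization principle applied to the revenue model itself.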