
Event App Security – Is it taken seriously?

The recent Tory Party Conference app privacy and security breach raises an important question: do apps undergo security testing?

The answer, it seems, is: very few do.

The immediate consequence falls on the end user – in this case several politicians have apparently had to change their telephone numbers. For any of us that is a major disruption to friends, family and work, but for a high-profile politician the issues go wider, to personal safety and security.

In April 2018 Security Intelligence scanned a generic set of mobile apps (not just event apps) and found 85% had exploitable vulnerabilities. What makes it likely that event apps exceed this percentage is that the vast majority are one-offs, built specifically for a single event. They are built on a tight time and cost budget that allows for neither independent security analysis nor what is generally referred to as penetration (pen) testing. In the case of the Tory Party Conference app, it is clear there was not even time for proper app testing: the vulnerability exposed in the CrowdComms-developed app needed zero hacking capability. Any end user could gain full control of another user's account just by knowing their email address.
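To illustrate the class of flaw involved, here is a hypothetical sketch (with made-up names and data – not CrowdComms' actual code) of an endpoint that grants account access from an email address alone, alongside the ownership check that would have prevented it:

```python
# Hypothetical sketch of the flaw class: an endpoint that hands over an
# account based on an email address alone, with no proof that the caller
# actually owns that address. All names and values are illustrative.

ACCOUNTS = {"mp@example.com": {"phone": "+44 7700 900123"}}
SESSIONS = {"token-abc": "mp@example.com"}  # auth token -> verified owner

def get_account_insecure(email):
    # Vulnerable: anyone who knows the email gets full account access.
    return ACCOUNTS.get(email)

def get_account_secure(token, email):
    # Fixed: the caller must present a token proving they own the email.
    if SESSIONS.get(token) != email:
        raise PermissionError("caller does not own this account")
    return ACCOUNTS[email]
```

The fix is a single authorisation check – which is exactly why even a basic pen test, or indeed ordinary app testing, would have caught it.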

Earlier this year we demonstrated our Krowd app, and the crowd safety and security features under development, to West Midlands Police. They were interested and asked if we'd be ready for the Tory Party Conference. We said no, simply because we had planned security testing that might not be complete in time for that event. Perhaps we were prescient?

Having undertaken an independent pen test, we can attest to its value. We are a company that works hard to protect people's privacy (we built the world's first GDPR-compliant event app) and to sustain the confidentiality of our customers' information; as such we treat security as the foundation for privacy in a digital age. Despite this, we were alerted to several security weaknesses to fix, and advised of a number of techniques to further enhance the security of the system. As we are also undertaking product research and extensions for the UK Office for Security and Counter-Terrorism, we are reviewed by their programme agency, the CTS Division of DSTL in the Ministry of Defence, who reviewed the pen test results and ensured certain additional tests were undertaken to address counter-terror concerns. Overall this effort took a month of elapsed time and over three person-months of engineering and consultant effort, a cost which is not visibly built into any event app price model that we have yet found.

Because we have built one app that works at any event without modification, the value of these pen tests accrues to every event our Krowd app is applied to. The point is that, for the majority of the event app market, the approach to app development does not allow any surety or confidence that your customer data and their personal privacy won't be easily breached.

More broadly, we are concerned that the general app industry is a long way behind in understanding the importance of security as a foundational aspect of building any software – especially software that takes data such as location, telephone numbers or other sensitive identifying information. We should take the Tory Party Conference app as a reminder to us all: ask your app supplier, "have you done a pen test?" It's the baseline for trust in digital systems.

John Trickett, the Labour Shadow Cabinet Office minister said “How can we trust this Tory government with our country’s security when they can’t even build a conference app that keeps the data of their members, MPs and others attending safe and secure?” to which we say “so tell me your conference app was independently pen tested!”


Krowdthink receives Innovation Award for crowd resilience, safety and security

Krowdthink innovation Award Baton

Krowdthink Ltd, developers of the Krowd® event and venue app for security and fan engagement, announces that it has been awarded best innovation in crowd resilience by Major Events International (MEI).


In front of an MEI audience of world cup organisers for Cricket, Rugby League and Netball, plus major International Games including Invictus, Commonwealth (Birmingham), Lima, Paris, Tokyo and Australia Gold Coast, Krowdthink’s Managing Director and co-founder was passed his award relay baton, symbolic of teamwork and trust.
Krowdthink’s approach is to enhance its venue-based communication and fan engagement platform, the Krowd®, to integrate venue security staff with the crowd they are charged with keeping safe. Krowdthink’s Krowd® innovatively combines a secure and private-by-design engagement app with features to enhance safety and security operations within high foot-fall spaces. It enables threat, event, venue and safety information to be shared via mobile devices, transforming ‘the crowd’ into a ‘virtual sensor’ to effectively identify threats and easily respond via alerts.


Following the Manchester Arena bombing and the subsequent terror attacks in London in 2017, the UK Government Office for Security and Counter-Terrorism (OSCT) within the Home Office sought to encourage innovation to improve crowd resilience. Key areas of interest in the call included improving the detection of threats from explosives and weapons within a range of high foot-fall crowded places, to reduce the chance of such attacks happening again. The UK Defence and Security Accelerator (DASA) was tasked with engaging the UK technology community, resulting in funding being awarded to Krowdthink to further develop their app, which aims to make the crowd a participatory threat sensor and responder.
“We are deeply honoured to have been selected for this award,” said Geoff Revill, Managing Director and co-founder of Krowdthink Ltd “It is further validation that our innovation to extend fan engagement with the opportunity to participate in keeping the crowd safe, has huge potential at the world’s largest sports events.”
Dennis Mills, CEO of Major Events International adds “We selected Krowdthink for this award because of their contribution to one aspect of all major events – safety of those in crowded places. They have come up with an innovative solution which can help reduce concerns about suspicious circumstances and allow a rapid response to emerging risks. The award recognises the depth of their understanding of this vital area and we were delighted to acknowledge their achievement and help raise their profile to help them develop their business.”

About Krowdthink Ltd 

Krowdthink is innovating to disrupt the events sector by building one app that any visitor to any venue or event can use without pre-registration or configuration, yet which can be orchestrated, and provided with security support, by the event or security team in the venue. We use existing communication infrastructure to build public intranets that enable digital engagement in the context of the venue or event. Our world-leading GDPR-compliant implementation of event tech means our solution is privacy-respectful of the individual, and confidential for our business customers too, as we neither know nor collect any information about venue or event users. Krowdthink's mission is to become the world's most trustworthy fan engagement and event networking platform.

For more information, or to download the company logo and Krowd event app logo, use the media contact below.


Media Contact:

Phone: +44 117 230 2344

About the Krowdthink crowd resilience research and development team 

Krowdthink's DASA (Defence and Security Accelerator) and OSCT (Office for Security and Counter-Terrorism) research and development project is ably facilitated by Bar Associates and supported by Wright NoW Solutions, Xebre, Telaugos Solutions Ltd and Right Objective.

Our Concerns on Blockchain – Please prove us wrong

Before reading on, view this excellent YouTube TEDx talk on the value and benefit of blockchain:

Note the speaker's view that uncertainty is addressed through:

1. Transparency – of who you are dealing with, a user-controlled personal identity tool, and visibility of the transaction chain
2. Control – over which aspects of identity you share to enable a transaction
3. Remedy/Accountability – she relates this back to transparency and control (a little weak, as it assumes the individual has the legal and practical capacity to act, but the right idea)

These three elements are the building blocks of trust.

She said: "What keeps the blockchain verified is our mutual distrust" – "converting uncertainties into certainties".
All good…now the downside – remember – blockchain is a PUBLIC distributed database:
a. Blockchain is in a massive hype cycle, based mostly on most people's lack of understanding of the underlying tech.
b. The tech relies on cryptography as the TRUST point – if we 100% trust the cryptography we can 100% trust the blockchain. I guarantee that time undermines every encryption-based trust point, because time allows compute capacity to exceed encryption resistance. That is not an issue if the blockchain's cryptography can be revised or updated – but I have looked, and no one has yet shown me how this can be done. Architecturally, blockchain has not facilitated this, which means that from a security PoV it is fundamentally flawed.
c. All blockchains use hashing, a one-way cryptographic technique that is extremely hard to reverse. So time is on blockchain's side; it won't get hacked any time soon. But therein lies its medium- to long-term issue if point (b) is not addressed, as humans tend to forget that the clock is ticking.
d. A dedicated quantum computer (already at 51 qubits) may one day break such a hash – and if economies become based on blockchain, why wouldn't a government body dedicate a quantum computer to it?
e. Unfortunately, some blockchain systems also use standard encryption as part of their infrastructure, in which case the time before those blockchains become untrustworthy for some aspect of their service is even shorter.
f. The management of a blockchain always needs a security infrastructure, and every one we have seen so far is effectively a centralised solution, undermining the distributed security value of blockchain. Where blockchains have been hacked to date, this is where the hacks have happened, because such solutions are always hackable… refer back to (a) and the hype cycle.
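To see why point (b) matters, consider a toy hash-chain sketch (our illustration, not any real blockchain's code). Each block stores the digest of its predecessor, computed with one fixed algorithm – so "upgrading" the hash function invalidates every existing link:

```python
import hashlib
import json

def block_hash(block, algo="sha256"):
    # Hash a block's canonical JSON form with the chain's fixed algorithm.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.new(algo, payload).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev": prev_hash}

def verify_chain(chain, algo="sha256"):
    # Every block must carry the digest of its predecessor, computed
    # with the same algorithm the chain was originally built with.
    return all(cur["prev"] == block_hash(prev, algo)
               for prev, cur in zip(chain, chain[1:]))

genesis = make_block("genesis", "0")
b1 = make_block("alice pays bob", block_hash(genesis))
chain = [genesis, b1]

assert verify_chain(chain)              # valid under the original algorithm
assert not verify_chain(chain, "sha1")  # swapping algorithms breaks every link
```

This is the architectural lock-in we mean: the moment compute power catches up with the chosen hash, there is no graceful migration path for the links already written.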
Imagine a world built on blockchain for all transactions, and then the encryption gets broken – not a world we want to be in when those consequences fall.
That said, for digital business to flourish we need to find ways to trust the security infrastructure, and blockchain is the best we have seen so far. But we really want to see the fundamental flaw of un-updateable blockchain hashes fixed before betting our customers' trust on it, while this hype cycle (which creates undue expectations on customers' behalf) is in progress.
We could be wrong – we hope we are – but we have spoken with six different blockchain "experts" and not one has yet demonstrated to us that our concern is unfounded.

A Business Philosophy for a Better Internet

When Digital meets Real - Trust is more important

In a world where digital and real life are ever more closely entwined, what is a better Internet?

Simply put, it's a place I can go and trust that I can lead a digital life that is as free as my real life. However, our digital freedoms, and consequently our real-world freedoms, are undermined by one simple thing: the commercial reality of funding the Internet.

“We need a new business model for the Internet.”

If we take a small step back in time, a mere 30 years, to the birth of the Internet, visionaries in Silicon Valley and elsewhere saw its awesome potential. But the sheer scale of the opportunity had to be funded somehow. It was recognized that data was the key, and that personal data in particular could be a new tradable asset. But who would buy it? The simple answer: advertisers. They spent $0.5 trillion in 2016 and this is expected to reach $0.73 trillion by 2020! Of course this is just the tip of the personal data value iceberg, with Apple and Amazon combining this data value into shopping and device services of equivalent revenue scale. The combined 2016 revenue of Apple, Amazon, Google and Facebook is greater than the GDP (Gross Domestic Product) of 88% of countries – more than 176 of the world's 196 countries! These digital entities dominate in their various spaces and operate near-monopolies over our personal data, and through that our digital lives and our forming digital society. I would argue this advertising (and thus personal profiling) based business model was a necessity 30 years ago, even 20 years ago, because no one was personally willing to pay directly for services whose value they could not yet comprehend. Yet now we cannot conceive of living our lives without them. Try committing to never again use the products of just two of these four companies – it's virtually impossible for the average person on the street to imagine.

The issue is that these services, and thus our personal data, now power entire economies. Governments are unwilling or unable to challenge these dominants because they'd have to undermine digital services to the voting citizen. In short, they are stuck, and have to engage in the process of harvesting our personal data, or at least participate indirectly by funding advertising through these entities, in order to stay in or gain power. The recent debates on the Brexit and Trump campaigns show how paid-for digital influence by governments is suborning democracy, raising serious cause for concern for our freedoms as citizens.

The trust-undermining issue with advertising as an Internet funding model is that it’s highly competitive. The only way for one digital agency to succeed over another is to provide greater targeting – which in turn requires ever more detailed profiles of who we are, where we are, when we are there and even better, predictive analytics of what we’ll do next, all obtained using ever more deceitful techniques. Facebook now profiles its users in over 52,000 different ways! The advertising model can only grow by delving ever deeper into our personal data. The digital dominants sustain their position by knowing more about you than anyone else in their sphere of business. This means your social lives (Facebook), your purchasing power/habits (Amazon), your interests (Google search) and your physical movements (Google Android and Apple iOS) and a lot more.

“Advertising drives a commercial and economic need to digitally model every aspect of the real world and our real lives.”

The initial advertising funding model was developed to solve a problem, and that problem has now been solved: everyone 'gets' the value of the Internet. So why do we persist in pursuing the same old commercial model when the harms to our freedoms are becoming so apparent? All this data stored in the cloud is fundamentally insecure; no one can assure you or me that it can be maintained for the rest of our lives, yet this is what Apple, Facebook, Amazon and Google would have you believe, despite the daily evidence of insecurity to the contrary. Even if it were 100% secure, you and I as citizens have no idea who has it or for what purpose it's being used, fundamentally undermining our privacy, and through that our very trust in Internet service provision, which is demonstrably declining every year.

So what has changed that can allow a new business model to develop? Perversely, it’s the very success of the Internet. Digital is everywhere, in some countries more people have a smartphone than have a toilet or access to running water. Digital has become integral to our daily lives, it’s almost a necessity, certainly for a functioning economically progressive society able to compete in the global market.

In some parts of the world (the EU especially is taking a global lead) digital businesses are facing a tsunami of legislation. We at Krowdthink are big fans of the GDPR and have great hopes for the E-Privacy regulation in development, as well as standards like eIDAS and other regulations. Not because regulation is good in itself, but because these instruments are trying to inculcate trust and freedoms into digital society. Regulation has also become necessary to curb the slide towards distrust in digital services – but it is a stick, not a carrot. Business won't 'really' change unless it sees commercial value and opportunity in changing – plenty of businesses have persisted with the same old models even when facing certain demise unless they changed. In short, the Internet itself is ripe for disruption – and disruption won't come from technology, it'll come from a change in business model and digital market organization.

I have worked in real-time control systems for over 25 years, and what we at Krowdthink see is that the Internet, and indeed the burgeoning Internet of Things, has moved from a transactional IT-style infrastructure towards a real-time system of systems. Data is flowing continuously everywhere – personal data is often referred to as our digital exhaust, as we leave trails everywhere we go and in everything we do. But an exhaust is not a left-over to be stored and processed; it's live information that can facilitate business opportunity. Through research and real-world application we have a fundamental understanding of how people interact. The Internet has given us this understanding. Safely executed research can give us the analytical tools to use data in real time without the advertiser's need to subsequently track and profile all we do.

“What has been missed is the opportunity to build a new business model for the Internet based on real-time information flow. We don’t need to profile, track and model every single person on the Internet in order to deliver them digital value.”

Collating historic profiling data puts people at risk from the nefarious criminal element that is showing its strength against the weakness of the Internet. The criminals' goal is access to personal data for subsequent fraudulent activities, or the denial of access to data for service denial. Our data is both the strength and weakness of the Internet. What if we did not collect data? What if we minimized what was stored to almost insignificant levels? What if we built transactional value on real-time data – information volunteered and made available at a point in time in order to gain access to digital value? What if we re-invented existing services using this model? We at Krowdthink believe strongly that many (though not all) Internet services can be re-invented this way – and by doing so, by making the data attack surface smaller, we will end up with a better, more trustworthy Internet, one where we'll all engage more confidently and actually share more.

"Let's monetise the engagement opportunity – not people's personal data"

Redefining Trust in Digital Engagement

Krowdthink has spent a long time researching and thinking about how to build a business that is fundamentally trustworthy when delivering a digital service. This blog summarizes our implementation. The simplest expression is our Trust Pyramid. Its articulation has been described by some as an ethical business model, we make no such claim but appreciate the perspective.

trust model


Too many organizations think of trust as a binary condition, in which some customers trust the entity and some do not, and the aim is to get a higher percentage to state their trust. But such goal setting subverts the whole objective of trust. It becomes a game and, often – especially online – it's less about being worthy of trust and more about gaining trust using methods that are not worthy of it, a tactic that time will always reveal. What's needed is a strategy that gets stronger as time reveals the company's true activities. Hence fostering a company culture that seeks the unattainable goal of being trustworthy is the only sustainable approach to addressing the online trust deficit built up by 15 years of business models that gamify the customer engagement process.

There are two primary perspectives of trust that differ but share the commonality of customer/user confidence:


An attitude of confident expectation that one’s vulnerabilities will not be exploited


Confidence that the value exchange is fair and equitable and that loss of trust drives an equivalent/proportional consequence on both parties

The issue of consequence will be addressed later, but is a critical component for the development of trustworthy commercial engagement.

Empowering the User/Customer

Trust is obtained through the commitment to empower the customer/user in the engagement process in an open balanced mutually beneficial way. There are three pillars to empowerment that a business must engage in to foster customer/user trust. In the digital context a useful empowerment thought process is to consider all personally identifiable data, whether directly obtained (name, address etc), indirectly obtained meta-data (how a customer engages, when, where etc), or derived (information created through algorithm or correlation with other data sets), as conceptually belonging to the customer/user. The business is merely its temporary custodian. In the EU the GDPR will legally enforce this perspective.


The business must be utterly transparent with its customer/user about what Personally Identifiable Information (PII) it obtains and how that data is used. It should be open about how PII is secured at rest and in transit.

We suggest publishing all data types relating to PII with simple descriptions of use and how that data is secured.
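As a sketch of what such a publication could look like (the format and field names here are our own suggestion, not an established standard), a machine-readable register is both easy to publish and easy to check for completeness:

```python
# Illustrative (hypothetical) machine-readable PII register, published
# alongside the privacy policy so each data type, its use, and its
# protection can be inspected by anyone.

PII_REGISTER = [
    {"field": "email address", "source": "directly obtained",
     "purpose": "login and account recovery",
     "at_rest": "encrypted", "in_transit": "TLS"},
    {"field": "coarse location", "source": "indirect meta-data",
     "purpose": "show nearby event content",
     "at_rest": "not stored", "in_transit": "TLS"},
]

REQUIRED_KEYS = {"field", "source", "purpose", "at_rest", "in_transit"}

def register_is_complete(register):
    # Every entry must describe use and protection for its data type.
    return all(REQUIRED_KEYS <= entry.keys() for entry in register)
```

A register like this is exactly the kind of artefact journalists and assessment entities could learn to read and compare across suppliers.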

The GDPR is driving the evolution of new consent models with consent receipts that could eventually replace this publication process.

It is recognized that the majority of users will have limited ability to assess the veracity of this information, but if all businesses followed this approach journalists will get savvy and assessment methods/entities will evolve to determine whether the data processing meets the standards set out in the rest of this model.


The company should maximize the ability of the user/customer to control what PII is shared with whom, whilst ensuring a clarity of understanding of the limitations of purpose such PII is shared for. This is both a UI issue and a UX issue. UX is more important because the assumptive perspective of the customer/user should always be the one that holds true. Control also means the ability to withhold sharing of PII whilst still operating constructively in the digital engagement process. Of course many online services will not function without some level of PII sharing, but innovative thought about the minimization of what’s needed to deliver a capability will help facilitate simple communication of control over the limited amounts of PII are needed to drive the online engagement function.


Many trust papers talk about accountability as the third pillar of empowerment for trust. However, this often defaults to an understanding that it's accountability to the law. The law is always the lowest possible trust bar; it's a codification of privacy and security principles that sets the lowest standard for operation. Worse, it is defined mostly by business and its pro-commercial lobbyists; worse still, rarely does the individual have the financial, technical or legal capacity to pursue their rights under the law. Most privacy lawyers have become risk mitigators for their commercial clients.

Truly trustworthy companies seek to empower their customers/users with the ability to take their business to a competitor when or if their trust in their current service provider is lost. This is an extremely scary statement for most commercial entities, who primarily seek customer lock-in while not realizing that trust is the best lock-in of all. In fact, the act of coerced lock-in is a degradation of trust and is potentially unsustainable for the business.

To get trust one must first trust. Trust your business to do right by its customers, and trust your customers to do right by you in return. When you screw up, this trust will be rewarded with flexibility and time to address the screw-up.

When trust is lost and one party feels they suffered a more significantly consequential loss than the other, then recovery of trust becomes even harder.

The GDPR creates fundamental new rights for customers to both delete their PII and move it to a competitor. Far better to embrace these legislative requirements as business trust assets than to fight the law. Giving users a simple and unobstructed ability to delete and move their data is an empowering trust asset, as the customer now feels they have, individually and en masse, the ability to inflict consequence on a company and thereby ensure a change of operation in their favour. In this hyper-connected world they can do so anyway; you may as well embrace it as a business asset.
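Treated as first-class operations rather than grudging compliance features, the two rights are simple to implement. A hedged sketch, assuming a plain dictionary store (the store shape and function names are our illustration):

```python
import json

def export_user_data(store, user_id):
    # Right to portability: return everything held on the user in a
    # machine-readable form they can take to a competitor.
    return json.dumps(store.get(user_id, {}), sort_keys=True)

def erase_user(store, user_id):
    # Right to erasure: remove the user's records entirely, and
    # confirm nothing remains.
    store.pop(user_id, None)
    return user_id not in store
```

The less PII a service holds in the first place, the cheaper both operations become – which is one more argument for the minimization principle below.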

Development Principles

To deliver against the trustworthy aspiration, one has to foster a culture in the company that consistently strengthens the trust model. It has to cover both technology and process in both development and innovation, as well as in the commercial engagement model.


The 7 principles of privacy-by-design call out aspects of the trust model and security model needed, focused on the context of assuring customer privacy in the digital engagement process.

If properly followed, you will end up with apps and services that don't need privacy settings, because each stage of sharing becomes an explicit opt-in. In fact, arguably, a privacy setting is a privacy fail!

The explicit opt-in thought process, or consent model, is central to achieving PbD. PbD is not about not sharing; it's about being in control of what PII is shared with whom and for what purpose, backed by confidence that the service provider protects the user/customer from its own, or a third party's, inadvertent or deliberate use of their PII outside of their comprehension.
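A minimal sketch of that consent model, assuming a simple in-memory grant store (names and structure are illustrative, not a prescribed design): sharing is closed by default, and each grant is specific to a data type and a purpose.

```python
# Explicit opt-in consent sketch: no grant means no sharing, and each
# grant is scoped to one user, one data type, and one purpose.

class ConsentStore:
    def __init__(self):
        self._grants = set()

    def opt_in(self, user, data_type, purpose):
        self._grants.add((user, data_type, purpose))

    def opt_out(self, user, data_type, purpose):
        self._grants.discard((user, data_type, purpose))

    def allowed(self, user, data_type, purpose):
        return (user, data_type, purpose) in self._grants

def share(store, user, data_type, purpose, payload):
    # Default-closed: sharing proceeds only with an explicit grant.
    if not store.allowed(user, data_type, purpose):
        raise PermissionError(f"no opt-in: {data_type} for {purpose}")
    return payload
```

Because each grant is purpose-scoped, consenting to share location for safety alerts says nothing about sharing it for advertising – which is the control the text above describes.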


It needs to be recognized that security is not an absolute. It's an arms race, and every digital company will suffer a breach of security if it operates long enough. So it's critical this objective is seen in the context of the next objective: data minimization. Data minimization is a solid underlying principle of security design. Keep the value of what PII you hold to a minimum; it's as much a liability as an asset.

Consider security before writing the first line of code of a system. Adding security later is harder and more expensive, and in some cases impossible without re-architecting the system. So architect for security, don't just code for it.

All security relies on a trust point, even encryption. So remember every security system is as fallible as the human processes that surround it. Security is thus a process of privileged access management, both in human terms and coding terms. Hence it has to be inculcated into the culture of all aspects of the business.

User Data Minimization

This principle underpins both PbD and SbD. Given that no system can be guaranteed to be secure, it is incumbent upon a trustworthy entity to minimize the PII it holds, so that when a data breach occurs the consequence to the individual is minimized. Time is the greatest exposure tool for every digital system; even encryption gets progressively easier to circumvent as time progresses. In SbD it's a key principle: understanding what data you hold and how it is used is critical to architecting a security system. Less data means less complexity in this securitization process and thus less risk of insecurity in the system.

Don't obtain PII you don't need, don't store what you don't need to, and don't transmit what's not needed. These simple rules maximize the potential to keep PII private and minimize the consequences of a security breach. Properly embraced, they lead to innovations that strengthen the trustworthiness of a company.
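Those three rules reduce, in code, to a whitelist applied before anything is stored or transmitted. A sketch, with hypothetical field names:

```python
# Data-minimization sketch: whitelist the fields a feature genuinely
# needs and drop everything else before storage or transmission.
# Field names are hypothetical.

NEEDED_FOR_CHAT = {"display_name", "venue_id"}

def minimise(record, needed):
    # Keep only whitelisted fields; the rest never leaves the device
    # and never enters storage.
    return {k: v for k, v in record.items() if k in needed}

profile = {
    "display_name": "Sam",
    "venue_id": "stadium-42",
    "email": "sam@example.com",   # not needed for chat -> dropped
    "phone": "+44 7700 900456",   # not needed for chat -> dropped
}

assert minimise(profile, NEEDED_FOR_CHAT) == {
    "display_name": "Sam", "venue_id": "stadium-42"
}
```

The whitelist (rather than a blacklist) is the important design choice: new fields added later are excluded by default, so the system fails safe.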

No Covert Profiling or Tracking

This principle needs to be called out explicitly because covert profiling is the default operation of almost all online services today, from the simple website to the complex app. The culture of digital engagement celebrates technologists who can 'cleverly' obtain information about users – where they are, who they are, what they are thinking, their state of mind and so on – without communicating explicitly to the customer/user that this is happening. Reversing this trend requires an explicit recognition of it, and a top-line commitment not to operate that way, if there is to be any chance of a constructive reversal in the culture of digital engagement models.

Open Business Model

The default business model of the Internet is the fundamental cause of the downward spiral of trust in digital engagement. The pervasive sentiment that because I go online the normal rules of society no longer apply is driving dishonest engagement, reluctant sharing and active obfuscation. A digital society that drives these norms is not constructive; it is destructive. It seeks to set new norms. Yet these norms are not fostered in constructive debate; they are covertly forced upon us and only validated when time exposes them, all in the name of commercial endeavor that primarily benefits the four horsemen: Google, Facebook, Apple and Amazon.

The Internet is still in its infancy, and the current business models were needed to enable innovation, as no individual consumer was prepared to pay their way directly; instead, payment has come primarily through the profiling of people, allowing ever greater insight into their needs and wants to better target the adverts or service. This model will continue well through the Internet's teenage years and into its young adulthood. However, there is clearly a community of users that desires a more constructive digital engagement model, one in which they are respected and thus can trust their suppliers. The only way this can be fostered is to be totally open about the business models of the new breed of businesses. Allow the business model to be debated and validated; only then can trust be fostered.

Most existing online business models can be flipped, offering users/customers new methods to engage in which they are empowered over what is sold to whom for what purpose.

We at Krowdthink are focused on the engagement model, using real-time data to create unidentifiable groups of people who share common targetable advertising assets such as location, time and contextual interest. We make real-time engagement the revenue asset without exposing any PII or retaining any profiling history – advertisers target the app, not the people, and the app groups people in real time. Advertising value is determined not by an individual's profile or preferences, but by their value as a group – in our case, a crowd. The trick is that we do not need to profile people in order to group them: they group themselves at a point in time, in a place, around a specific interest.

We have the added benefit that value is driven first to the place owner, who is co-located with the Krowd and thus has a real-time contextual capability to deliver advertising engagement value. This opens a new engagement opportunity for businesses otherwise forced to engage digitally via the personal-profiling business model that our temporally and locationally displaced Internet demands.

Managing this advertising process so it does not intrude on the connected experience is as much of a challenge for us in the Krowd app as it is for Facebook or Twitter. However, we have a simple ad-block tool: an annual subscription to the app can block ads, or allow users to tailor and select the ads they want to see. These business models will be implemented by Krowdthink once we have a sustainably large customer base. The business model does not intrude on privacy at all, nor does it require users to think differently from the revenue models Internet and app users are already familiar with, but its implementation balances their interest in privacy with the need for our business to cover costs and make a profit.

A Philosophical View of Digital Society

We are often asked by investors: why don’t you take the easy option? You have a solid idea for localised digital engagement – just get the data, sell it and make us all lots of money.

The answer comes back to this: do we want to contribute to a digital society where this is the modus operandi? Is this the digital society we want to leave behind for our children? The short answer is no.

But we have to be realistic – there are two tsunamis of pressure that make it difficult to stand against this way of commercialising online engagement. The first is that the economic imperative will always put pressure on digital societal and individual rights – this is as true for the way governments operate as it is for the individual business. The law is usually insufficient to defend against this pressure: it is retrospective in the face of rapid technological change, it is written by humans and so open to interpretation by those with the funds to pursue one, and many citizens lack the intellectual or financial capacity to exercise their rights. Commercial and economic pressure pushes hard towards favouring the economic opportunity.

The second issue is that commercial law trumps privacy law – not explicitly in statute, but in practice. Every director of every company has an overriding imperative to return a profit for its shareholders. In digital society, personal data is the currency traded to deliver on this imperative. The law constrains some uses of this data, but insignificant fines and weaknesses in the enforcement structures, nationally and especially internationally, mean that personal privacy has become a cost of doing digital business.

But consumers are starting to become informed about how they are traded online, and there is growing resistance to it. In the media this mostly surfaces as security breaches, but big businesses are aware it is becoming a trust issue. And there is a big change coming: the General Data Protection Regulation seeks to get ahead of the law’s traditionally retrospective position in digital society. It contains real consumer powers and massive fines for businesses that do not comply.

The innovation opportunity is not just to embrace the regulation as a positive construct; it is to recognise that the commercial status quo of digital engagement is about to change, and that this change can be accelerated by innovators embracing the underlying principles of privacy en route to constructing a better digital society. In short: to compete on the basis of privacy value and trust in digital engagement. We can disrupt if we do this.

The only answer to the two commercial pressures outlined above is to make privacy an asset worth more than the covert trade in our personal digital assets. When that happens, governments and businesses will combine to create a better digital society for our children. This is what we strive for. It’s why I am driving the creation of an event called Privacy: The Competitive Advantage. Hope to see you there.



Privacy Settings are a Privacy Failure

The EU GDPR (General Data Protection Regulation), written into statute this month, explicitly calls for those storing or operating on personal data to follow the seven principles of Privacy by Design, the second of which is “Privacy as the Default Setting”.

If you follow the simple logic that all operations on your or my personal data are private by default, then really there is no need for privacy settings – none. In fact, the number and complexity of privacy settings correlates directly with the inherent lack of privacy in the platform or product you are using, generally driven by the platform provider’s business model: the monetisation of you.

As an app developer that fully embraces these principles, it is notable that our Krowd app has no privacy settings function at all. We start with respect for people’s data, treating it as if owned by the individual. That means maintaining the provenance of all data, meta-data and derived (analytic) data, so that every share becomes an explicit opt-in decision by the user, and the app interface makes clear what is being communicated, with whom, and for what purpose. This is the essence of privacy. Privacy is a function of control over what is shared with whom and why; it is not a lack of sharing.

Maintaining provenance also allows us to follow another GDPR principle – the right to delete – something incumbent platform providers will find almost impossible to implement without having tracked provenance.
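To make the point concrete, here is a minimal sketch (our own assumption of how such a store might look, not the Krowd implementation) of why provenance makes the right to delete tractable: if every stored record, including derived and analytic data, carries the user IDs it originates from, erasure becomes a single reverse lookup rather than a forensic search.

```python
# Minimal sketch of provenance-backed erasure. Conservative policy:
# any record even partly derived from a user is deleted with them.
from collections import defaultdict


class ProvenanceStore:
    def __init__(self):
        self.records = {}                # record_id -> data
        self.sources = defaultdict(set)  # record_id -> originating user IDs
        self.by_user = defaultdict(set)  # user_id -> record_ids they fed into

    def put(self, record_id, data, source_users):
        """Store a record along with the users it was derived from."""
        self.records[record_id] = data
        for user in source_users:
            self.sources[record_id].add(user)
            self.by_user[user].add(record_id)

    def erase_user(self, user_id):
        """Right to delete: remove every record derived from this user."""
        for record_id in self.by_user.pop(user_id, set()):
            self.records.pop(record_id, None)
            # Also unlink the record from any co-contributors' indexes.
            for other in self.sources.pop(record_id, set()):
                if other in self.by_user:
                    self.by_user[other].discard(record_id)
```

Without the `sources`/`by_user` indexes – i.e. without provenance – a platform that has already blended a user's data into aggregates has no principled way to find, let alone delete, everything derived from that person.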

When the business model of a social platform like Facebook is to monetise who you are, it has to start from the basic assumption that everything you share belongs to it, not to you – as does how and when you share (meta-data) and whatever can be derived (through analytics) from the aggregate of all this data. Hence Facebook’s privacy policy makes it clear: all your data belongs to them. Privacy settings are used to tick legal boxes and to hand back to the individual a limited level of control over some of the data (none of the meta-data or derived data – something we suspect will be challenged in future).

On the flip side, this also means there is an effectively infinite number of ways they might use your data in a manner you may feel breaches your privacy. Hence their platform, challenged by the latest GDPR legislation, ends up with an ever more complex set of privacy settings – a list that will only grow over time, eventually (if it has not already) defeating the very objective those settings were meant to serve: user empowerment and control over the use of ‘their’ data.