Privacy Economy I



How to measure its value



Have you really thought about what privacy is all about? There is no formal description or universal legal definition of the human right to privacy. Politicians, philosophers and businesses have grappled with what privacy means for society and its rules.

In other words, you are directly and passively trackable. These examples are not futuristic; they exist today. Note that we are not talking about when and how you use any application on your phone, as that is transactional data.

- Image courtesy of Shanghai Technology & Science Museum.

As we come to depend on Internet-enabled digital devices, each provides a glimpse or slice of your behavior, assembling an ever less opaque kaleidoscope of your identity. Unwittingly, one becomes a silent partner in behavioral targeting by others.

Behavioral targeting is the catalyst for creating value in digital commerce: to connect, identify, engage and transact. It is driven mostly by advertising, but also by how people search for information and download and use applications. All of this is measured by the triad of marketing metrics: recording events by recency, frequency and outcome.

- Image courtesy of authenticorganizations.com

Profiling is the first layer of identity, and it persists over time. The second layer measures relationships with others: the social identity across connected services. Belonging to groups is a key measure of individual behavior. It drives the identity formation that populates social networks, the sentiments expressed by members, and their contributions to social dialog (posts, comments, tweets, profiles and media sharing). A third layer is best defined by the social graph, which measures how connected you are to others: by your many or few connections, your rank, and your influence and importance relative to the group.

- Image courtesy of UbiVault.com

Measuring the value of profile targeting depends on which behavior matters in a given event. In the example given, data flows have many recipients, each with a different focus. The value of any data flow is measured by its half-life, namely, how useful the data remains per unit of time. The collective economic value is measured by which data attributes combine into different profile slices of an individual. The more invisible one becomes, the lower the collective data value; the more private, the weaker the targeting.
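The half-life idea above can be sketched as a simple decay function. The exponential-decay model and the sample dollar figures are illustrative assumptions, not part of any established pricing standard:

```python
# Sketch: value of a data attribute decaying with a half-life.
# The decay model and sample numbers are illustrative assumptions.

def data_value(initial_value: float, half_life_days: float, age_days: float) -> float:
    """Value of a datum that loses half its worth every `half_life_days`."""
    return initial_value * 0.5 ** (age_days / half_life_days)

# A purchase-intent signal worth $1.00 when fresh, with an assumed 7-day half-life:
fresh = data_value(1.00, 7, 0)    # full value at the moment of capture
week = data_value(1.00, 7, 7)     # one half-life later: half the value
month = data_value(1.00, 7, 28)   # four half-lives later: ~6% of the value
```

A short half-life (purchase intent) and a long one (home address) would price very differently under the same formula, which is the point of measuring flows per unit of time.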

Debates over the definition of privacy shifted after 1960 with the development of legal privacy protections, when computers made it easy to move information from paper to digital storage. Some defended privacy as control over personal information about oneself, while others proposed a broader concept rooted in human dignity. Societies are in a constant struggle to maintain a balance between groups of individuals that opt for change and others that prefer continuity. Knowing who belongs to what, and their makeup, is useful for minimizing disruption and future uncertainty.

This privacy equation is unbalanced. On one hand, governments want to know the makeup of their citizens, and surveillance in its various forms is a powerful weapon for tracking someone without their consent. Yet surveillance by citizens over organs of government and business is seen as a potential threat. Consider how smartphone cameras have leveled this balance in police actions, protest meetings, and so on. In 1998 David Brin, in his book "The Transparent Society", examined the challenge of balancing technology's impact on the fulcrum of privacy and freedom, arguing that individuals have the right to "watch the watchers."

To the point: what is privacy? Is it defined by the individual or by society? If one tries to define it by its synonyms (what it is like) rather than its antonyms (what it is unlike), the result is a hodgepodge of ideas, and technology keeps interfering. To enable social stability, privacy is negotiable; hence it has value. As you reduce the friction of information, privacy is depleted. That is what technology does: it reduces the time to a decision. Yet there is no contract, social or commercial, in which the value of privacy is quantified.

Here is a real example: FICO scores are used to determine the creditworthiness of individuals and businesses in the US (full disclosure: the author helped design the system). Is it a perfect measure? Of course not. Is it trusted? Only insofar as business has accepted FICO as a shorthand for quickly deciding whether to lend or sell an economic good. It augments your profile in a financial sense and affects how you are treated in society; it is a gateway that influences how you live. Was the individual a party to this process? Perhaps through the choices they made in how they lived, but not in a contractual sense where their information is treated as an asset.

Creating Value from Privacy

Let’s take a simple measure: your value based on your economic net worth. As your worth increases, preventing others from taking advantage of it becomes a concern. High-net-worth individuals are at greater risk than people with less to lose, but fundamentally this applies to everyone. At lower economic levels, people are willing to give up more of their privacy in return for some benefit: government economic support, or commercial treatment such as senior discounts and other rewards. You are willing to give up privacy in exchange for a benefit.

A commercial enterprise is willing to give some benefit in exchange for knowing something about the recipient, be it age, sex, location, family unit and more. Its motive is straightforward: what does it cost to acquire data on a prospect in exchange for providing the best price and terms? Whether this is driven by market share, competitive offerings or brand loyalty, it is about gathering data on customers and prospects, and the cost is reflected in advertising, marketing and customer support.

We now have the first part of the value equation. What’s missing is the second part: how do you measure the price (cost) of acquiring the data? If information has value, everyone tries to minimize the cost of getting it, privacy be damned. But with enforced regulations such as the EU’s GDPR, these costs go up. Is the individual’s profile of value to the commercial enterprise? Of course. Does this value transfer directly to the consumer? Not always.

Value is transferred in numerous ways. Your membership in a supermarket rewards program earns significant discounts; the store wants you to shop with it, and in return it analyzes your shopping habits and purchases to manage product inventory and leverage contract terms with suppliers. For airlines, mileage and frequency of travel trigger just-in-time offers and preferred treatment. With financial institutions, your net worth affords different levels of engagement. A pharmacy chain records your prescriptions and shares them with drug companies.

This list of data profiling is endless. What’s missing is the engagement and notification of the individual. If their data is shared by others in the supply chain, why don’t people benefit from this non-transparent transfer of value? If they were part of the transaction, their willingness to share further details of their “profile” would help all parties. It is a simple redirection of advertising and marketing costs from existing channels to the benefit of the individual, in a transparent and auditable way.

The Rise of Data Aggregators

You know the usual suspects: Facebook, Amazon, Apple, Alibaba, Google. These e-commerce aggregators collect and analyze data for their own benefit and that of their partners. A second tier consists of businesses with large market share: Walmart, Target, United, Shell, WeChat and others. The first group collects data on behalf of its advertisers, the core of its revenue; the second collects data on those it engages with: consumers, vendors and advertisers. Together they form an information economy that uses purchase intent and transactions, in the physical world and online.

Their value proposition is how individual data can be harnessed at the most effective marginal cost to an information user. What we are seeing is a battle to acquire, model and offer information in a digital world. Unfortunately, this happens invisibly to the individual. The Internet is an untrusted environment: data is exposed without our ability to examine and challenge its veracity, and data collected persists for a long time. A bad review of a restaurant is difficult to expunge if untrue; a negative product rating does the same damage. As long as the “owner” of the original data is not party to the economic transaction, manipulation of “truth” is endemic. But suppose an individual willingly shares their kaleidoscopic profile for the benefit of a third party. The more accurate it is, the greater its value. How can this be done? It requires a formal definition of privacy data exchange.

Trusting that an event occurred is a critical requirement: immutability. We also want notification that the data was used. How valuable the individual’s profile is remains subject to negotiation, that is, a contract. How we monetize this value on behalf of the individual will evolve as we learn to measure the value of privacy. Equivalence is a more difficult problem, but it can follow established methods from currency exchanges, derivatives, or proxy measures of anything of value. Fortunately, two new methods have appeared: blockchains and smart ledgers. Without going into detail here, these are an important foundation for managing and recording privacy as a commodity.

An Economic Model for the Value of Privacy

The economics boil down to a utility-surplus model. Attributes are organized into sets. Sets follow an ontology that gives a formal definition of attribute properties; the W3C effort in semantic ontologies is one example. Within a set, attributes are statistically modeled for the intended end use (Z-scores, SVMs, conjoint measures). Sets are input into a multi-dimensional probability matrix (M-D) exposed to third parties. Based on the score or measure reflected by the M-D value, a third party can use this information, but at a negotiated price.
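One slice of such an M-D matrix can be sketched as category affinities computed from weighted attributes. The categories, attribute names, weights and the logistic squashing below are all hypothetical illustrations, not the author's actual model:

```python
# Sketch: turning an individual's attribute set into a vector of category
# affinities, one 1-D slice of the multi-dimensional probability matrix.
# All names, weights and the logistic scoring are illustrative assumptions.

import math

def affinity_scores(attributes: dict[str, float],
                    weights: dict[str, dict[str, float]]) -> dict[str, float]:
    """Logistic score per category from weighted attribute values."""
    scores = {}
    for category, w in weights.items():
        z = sum(w.get(attr, 0.0) * value for attr, value in attributes.items())
        scores[category] = 1.0 / (1.0 + math.exp(-z))  # squash to (0, 1)
    return scores

# Hypothetical normalized attributes and per-category weights:
profile = {"travel_searches": 0.8, "grocery_spend": 0.2}
weights = {
    "airline_offers": {"travel_searches": 2.0},
    "supermarket_coupons": {"grocery_spend": 2.5},
}
scores = affinity_scores(profile, weights)  # e.g. higher affinity for airline offers
```

Each category's score is a probability-like measure a third party could query, without ever seeing the raw attributes behind it.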

Using the M-D measure leads to one of three outcomes: a non-event, an engagement, or a transaction. These individual outcome metrics update a global anonymous M-D datastore as well as the individual’s own M-D profile. Using machine-learning algorithms, decision weights in the global matrices are updated, so over time the collective use of measures revises their values. This is important because an individual profile may have one value at time T1 and a lower or higher value at Tn.
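The weight-update step can be sketched as an exponential moving average over outcomes. The learning rate and the 0/1 outcome encoding are assumptions for illustration; any online-learning rule would serve the same role:

```python
# Sketch: updating a global decision weight from transaction outcomes with
# an exponential moving average. Learning rate and outcome encoding
# (1 = transaction, 0 = non-event) are illustrative assumptions.

def update_weight(current: float, outcome: float, learning_rate: float = 0.1) -> float:
    """Move the stored weight a fraction of the way toward the new outcome."""
    return current + learning_rate * (outcome - current)

w = 0.5                        # value of a profile slice at time T1
for outcome in [1, 1, 0, 1]:   # observed outcomes over time
    w = update_weight(w, outcome)
# w now reflects the value at Tn: higher than at T1 after mostly-positive outcomes
```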

How is this transferred value used by the individual? Rather than demonstrating how it is done today, consider new methods that could underpin a new economic model for privacy. Recall that immutability is the central requirement for information exchange; the ability to assign some “value” to the exchange is the second; and converting that value into an equivalent good or service is the third.

Blockchains and Tokens

The rise of alternative currencies such as Bitcoin and Ethereum has demonstrated the viability of blockchain methods for distributed, secure applications. Without going into the details of how blockchains work, they support immutability and security of data without a centralized authority; a blockchain is the plumbing on which cryptocurrencies rely. In concert with another method, smart ledgers, they provide a mechanism to securely record transactions whose payload can be defined by any application.
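The immutability property can be sketched with a minimal hash chain: each record commits to the previous one, so tampering with any earlier entry invalidates every later hash. This is an illustration of the principle only, not a production blockchain (no consensus, no signatures):

```python
# Minimal hash-chain sketch of immutability: each block's hash covers its
# payload and the previous block's hash, so edits break verification.

import hashlib
import json

def add_block(chain: list[dict], payload: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev_hash, "payload": payload}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain: list[dict]) -> bool:
    prev = "0" * 64
    for block in chain:
        body = {"prev": block["prev"], "payload": block["payload"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

ledger: list[dict] = []
add_block(ledger, {"event": "profile_queried", "price": 0.02})  # hypothetical payload
add_block(ledger, {"event": "offer_accepted"})
```

Altering the price in the first block after the fact would make `verify(ledger)` fail, which is exactly the trust-that-an-event-occurred property the text requires.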

Now imagine all the data attributes that describe you, stored with you (at the edge) and transformed into an M-D matrix of probabilities that measures your affinity across multiple categories. That is your profile, encrypted and not stored by any central entity. The same method of categorization used in your local M-D matrix is used in a central datastore, with one proviso: there the values are aggregated anonymously across many individuals. Someone querying the central M-D matrices gets the metric of how many individuals belong to a group (affinity); what they do not get is which individuals belong to it. For that you need a second level of privacy.
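A sketch of that anonymous central aggregation: profiles report group membership under a one-way hashed identifier, so the datastore can answer "how many individuals are in this group?" but holds no link back to who they are. Group names and the salting scheme are illustrative assumptions:

```python
# Sketch: central datastore counting affinity-group membership from hashed
# identifiers, without storing who the members are. Names are hypothetical.

import hashlib
from collections import Counter

def hashed_id(name: str, salt: str) -> str:
    """One-way identifier; the salt stays with the individual, at the edge."""
    return hashlib.sha256((salt + name).encode()).hexdigest()

central_counts: Counter = Counter()
seen_pairs: set[tuple[str, str]] = set()

def report_membership(member_hash: str, groups: list[str]) -> None:
    """Count each hashed member once per group; identity is never stored."""
    for g in groups:
        if (member_hash, g) not in seen_pairs:
            seen_pairs.add((member_hash, g))
            central_counts[g] += 1

report_membership(hashed_id("alice", "salt-1"), ["frequent_flyer"])
report_membership(hashed_id("bob", "salt-2"), ["frequent_flyer", "organic_food"])
# central_counts now answers "how many?", never "who?"
```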

Proxy Agents

We connect in various ways through the Internet. At its core are a number of protocol layers and an addressing scheme, IPv4. Because IPv4 offers a maximum pool of 2^32 addresses, an alternative is rapidly gaining acceptance: IPv6, whose 2^128 addresses form an effectively inexhaustible pool. This permits entities to reserve a block of such addresses for communication. To preserve a double-blind privacy wall between the profile owner and the M-D user, a neutral agent or proxy must execute the transaction. For example, an advertiser (user) can send information to an individual based on their profile without knowing who they are. Even when the individual responds, say through a browser, the proxy agent is the only IP address visible to the requestor. Proxy agents are transient: they exist only for the transaction and are returned to the pool for the next request, with no history of the event.

Anyone trying to track the proxy has no way to link that agent to its later reuse by another individual with a different profile: no history, cookies or other tracking beacons are of any use, hence the double-blind principle. The M-D repository stores profiles under a hashed identifier. To reach individuals on behalf of an advertiser, it broadcasts a request keyed to their M-D profile (a push). If an individual responds, the M-D repository knows only that an agent has transacted against one of the many M-D profiles in its datastore, not which unique individual responded. Additional steps further preserve anonymity, but the process never gives the datastore information about a specific individual.
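The transient, history-free agents described above can be sketched as a simple lease/release pool. The addresses are toy strings standing in for IPv6 leases, and the class is an illustration of the lifecycle, not a networking implementation:

```python
# Sketch: a transient proxy-agent pool. An address is leased for one
# transaction and returned with no record, so it cannot be linked to its
# next user. Toy strings stand in for real IPv6 address leases.

import random

class ProxyPool:
    def __init__(self, addresses: list[str]):
        self._free = list(addresses)

    def lease(self) -> str:
        """Hand out a random free address for a single transaction."""
        i = random.randrange(len(self._free))
        return self._free.pop(i)

    def release(self, address: str) -> None:
        """Return the address, keeping no record of who used it or for what."""
        self._free.append(address)

pool = ProxyPool(["fd00::1", "fd00::2", "fd00::3"])
addr = pool.lease()    # the only address the advertiser ever sees
# ... the transaction happens through `addr`, then:
pool.release(addr)     # back in the pool, history-free
```

Because nothing about the lease is logged, an observer who sees the same address twice cannot tell whether it served the same individual or two different ones.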

The current ways of measuring impressions, such as click-throughs and engagement, are unaffected, but privacy is maintained until a transaction is completed. What is different is that this approach lends itself to performing data modeling at the edge, with the individual, not at a central system; the central M-D datastore can be updated from transaction outcomes without compromising an individual’s profile. One benefit is that it disintermediates the data aggregator, whose role relies on tracking online behavior across web sites and applications. With this approach, privacy value is negotiated based on the individual’s intent or sentiment at the time of the event: someone who is just exploring is worth less than someone with intent to transact. This dynamic pricing method makes for an interesting case, as there is an implied transfer of value from the user of the profile to the individual, analogous to lead-generation pricing as compared with web-site impression inventory pricing.
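The intent-based pricing above can be sketched as a tiered multiplier over a floor price. The tiers, multipliers and floor price are hypothetical; the point is only that intent-to-transact prices like a lead while browsing prices like an impression:

```python
# Sketch: dynamic pricing of a profile query by declared intent at the
# time of the event. Tiers, multipliers and the floor price are assumptions.

BASE_PRICE = 0.01  # assumed floor price per profile query, in dollars

INTENT_MULTIPLIER = {
    "exploring": 1.0,      # impression-like value
    "comparing": 5.0,
    "ready_to_buy": 25.0,  # lead-like value
}

def query_price(intent: str) -> float:
    """Price a profile query; unknown intents fall back to the floor tier."""
    return BASE_PRICE * INTENT_MULTIPLIER.get(intent, 1.0)
```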

Summary

We have explored what privacy means in a limited sense and claimed that privacy can be measured and a value established. Rather than changing the way businesses budget for advertising and marketing on the Internet, we propose reallocating existing budgets to engage the individual directly rather than through data aggregators and web publishers. Using proxy agents to preserve anonymity while helping businesses find prospects is the primary goal; at the same time, ensuring privacy is paramount. Finally, we propose that companies explore the methods outlined here to increase response levels, improve brand awareness and employ more sophisticated behavioral targeting, but not at the expense of individual privacy.

Author

Andre Szykier CTO

Ubivault
BlockchainBTM
andre@ubivault.com