rooting around for grubs in diverse soils


Whose privacy?

Jaap-Henk Hoepman’s Privacy Is Hard and Seven Other Myths: Achieving Privacy through Careful Design is a compact and attractively designed book that aims to do two things: scotch a handful of myths about privacy, and make a positive case for how digital technology can protect it. For the author, digital technologies appear like the reprogrammable T-800, able to pivot from being a menace to humanity in the first Terminator film to being its (not quite) saviour in the second.

The myths are plentiful, and Hoepman’s selection is as interesting as what does not make the final cut, such as the myth of the privacy paradox. Privacy has been commodified: the cost of a ‘data privacy solution’ starts from $50 a month for some cookie pop-ups and rises to a few hundred dollars a month if you want seamlessly to ‘gain user consent’. The fact that there is competition over who cares the most about your privacy would suggest we have come some way since Zuckerberg pronounced its death as a ‘social norm’, but the notion still has widespread currency. Another myth is that data are somehow abstract, weightless, invisible and odourless elements stored in a non-physical cloud. Data means computing power, which relies on materials extracted from the earth to build the hardware and generate the electricity, and on human labour, usually poorly remunerated, early in the supply chain. Still another myth is that privacy violations are only committed by other people. This is the essential logic of the marketing message accompanying Google’s Sandbox, Facebook’s exclusion of third parties from its APIs, and Apple’s App Tracking Transparency. In none of these initiatives, as far as I can tell, were there any undertakings by the gigantic companies in question to reveal or improve their own data practices.

Meanwhile, Privacy is Hard hints at a number of underlying questions.

First world problems

First, whose privacy? Today’s digitalised society of course affects the rights to privacy and data protection, but it does so unevenly. This unevenness is a function of the vast and growing inequalities within and between societies. The Global South is unfortunately often absent from these discussions, which is inexcusable given the growing body of scholarship now available – Achille Mbembe or Nanjala Nyabola to cite just two – plus the empirical analyses contained in the UNCTAD Digital Economy Reports of recent years. They reveal populations of poorer countries being farmed for their data by Chinese and US tech companies in exchange for connectivity, electronic IDs and various other gadgets and services. Migrants are the objects of tracking and biometrics technology, and will become more so as our environment deteriorates and geopolitics become increasingly volatile in the coming decades. Children: a Human Rights Watch report last week found that almost all EdTech products risked infringing, or infringed on, children’s rights. Amazon warehouse workers and Uber drivers have long been treated as slaves to the algorithm, and as a result of the pandemic the market for workplace surveillance is now pumped up on steroids. Workers have little choice but to acquiesce if they want to keep their jobs.

Yet privacy is alive and well, and always will be, for the powerful. Despite the efforts of congressional majorities and state attorneys-general, still no one can get at Donald Trump’s tax returns. Mark Zuckerberg would not tell Congress what hotel he was staying in, although his social media empire is built on the collection of just such trivial data concerning everyone else. Independent audit and transparency are always resisted by the powerful, with a plethora of tools like NDAs, non-compete clauses and trade secrets wielded by smart lawyers ready to intimidate and exhaust any potential challengers, such that the mere threat of litigation is enough to chill most of them into silence.

Hoepman’s proposition about using tech to enhance privacy may well be valid for those of us who are familiar with VPNs, encrypted messaging apps and the like. However, it is not fair to place the onus on everyone to become so tech-savvy. The very act of collecting and using data is an indication of power. Kate Crawford has been concerned specifically with AI, but her call to reason is equally apt for any processing of data, especially at scale, given the computing power, human resources and other administrative and legal apparatus that are required.

‘Datasets [in AI] are never raw materials to feed algorithms: they are inherently political interventions. The entire practice of harvesting data, categorizing and labelling it, and then using it to train systems is a form of politics.’

Hoepman recognises the inherent risk in the assumption of a rational consumer in a reality of power imbalances. In the concentrated markets of the digital economy, where the dominant business model is tracking and profiling, how can there be genuine choice? If there is no choice, how can there be freely-given consent, the only valid form of consent under the GDPR?  This is where the legal regimes of privacy and competition intersect, yet regulators still have not managed to converge on a solution.

Wizards of Oz

Second, what actually motivates all this rampant data harvesting? The book selects a random Android mobile app, the ‘Brightest Flash LED’, which hoovers up all manner of data utterly irrelevant to the functioning of a torch – location, address book and microphone recordings – as an example of these supposedly indispensable, privacy-invasive business models. What is not clear is why such an app is configured to collect so much personal data in the first place. Typically these data get funnelled to Google and Facebook. Is their end goal really manipulating our brains to buy more stuff? (Or, as the EU AI Act authors put it, to ‘deploy subliminal techniques beyond a person’s consciousness in order to materially distort a person’s behaviour in a manner that causes or is likely to cause that person or another person physical or psychological harm’.) Privacy is Hard cites Carole Cadwalladr’s contention that the abuses of Cambridge Analytica tipped the 2016 referendum towards Brexit and, of course, I would like this to be true. Unfortunately, this is an easy answer to a complex question, one which runs the risk of doing the surveillance capitalists’ marketing for them, for free.

Hoepman asserts early in the book that Google, Facebook “and the like … know everything we share on their platform…” Do they really, or have they just managed to get advertisers to believe them? The author later reflects that thinking about surveillance is not enough and that ‘perhaps we need to shift our attention and focus on the underlying capitalist premise instead’ (including a reference to Morozov’s stunning dissection of Zuboff’s Surveillance Capitalism), but here the thought is left tantalisingly hanging in the air. (In fact, there is no need to assume this is a typically capitalist malaise; a digitalised feudal system would be driven by the same dynamics of power.) He also recognises that Google and Facebook are basically advertising companies. Facebook monopolises (for now) the inventory of social media eyeballs, its various tracking techniques so ubiquitous that you cannot escape them whether or not you subscribe to one of its services. Google has by far the biggest search engine and is powerful on both the demand and supply sides of the adtech ecosystem. These companies certainly, therefore, want your data. The question, left unexplored, is whether they are monetising the actual data or rather the belief in their claim that the data gives them unrivalled powers of predicting future behaviour on the basis of past behaviour. Such claims cannot be verified because there is no transparency about what the companies really do with the data. We all see ‘personalised ads’ for the same things we have only just purchased. Now internal documents obtained by Motherboard suggest that Facebook themselves do not know what data they have or what happens to them. Most data is just not used at all; most companies do not have the imagination or computing power to get value from it. Surveillance capitalism might more accurately be described as industrial-scale data hoarding: hoarding that requires the burning of fossil fuels.
In such a scenario, the Emperor has no clothes, and it is all one massive scam.  

The simulacra of privacy

Furthermore, for all the discussion of using tech to hide your data, it would be a shame to reduce all digital relationships to zero trust by default. This would not reflect how we are in real life. Hoepman indeed cites Helen Nissenbaum’s work on the contextual nature of privacy. If you are in a relationship of trust and respect then you have few qualms about exposing yourself; otherwise, there are gradations in the privacy human dignity demands. These norms have been unsettled by the growing interposition of technology – and of the relatively few entities in control of that technology – into social and economic life, such that you have no choice but to expose yourself to those entities, even though you do not trust them. Can any systems redesign be expected to address this? I am not sure. Hoepman concludes that privacy could be restored if we could ‘redo the plumbing’. Perhaps we should focus instead on repairing socio-economic ties and diminishing the depredations of gigantic faceless intermediaries.

And yet Privacy is Hard is timelier than its author could have anticipated. It treats privacy as more or less coterminous with the processing of personal data, which is not a problem for anyone except the most unreconstructed European legal geek. But there are infringements of privacy that make data protection violations look trivial. In the US, there are now well-founded fears of a real-life spillover when, as expected, the Supreme Court overturns Roe v Wade, which rests on a putative but contested right to privacy. Red States are lining up to criminalise not just abortion, but also contraception and abortion support. So all of the personal data harvested by platforms, apps and websites will become susceptible to disclosure to law enforcement, including data concerning women worried about what to do about a pregnancy or their fertility.

The market for privacy as commodity may have come to resemble Baudrillard’s simulacrum, a representation that bears ‘no relation to any reality whatsoever’. But the privacy-protecting tech solutions Hoepman promotes, like encryption and minimising data retention, could well suddenly become essential rather than nice-to-haves. Market-driven and voluntary privacy-by-design could become the only safeguard in the United States against the intrusion of the state into a person’s most intimate sphere – if only everyone were able to avail themselves of it.

The age of (trying to regulate) surveillance capitalism

For a while now the most interesting discussions about tech and privacy have been happening in the United States. In politics, the US Congress in 2018-2020 was awash with proposals for a federal privacy law. Privacy academia is also in full spate, and there are few more incisive and erudite contributors than Julie Cohen, who in her latest blockbuster suggests what such a federal privacy law should and should not do if it is to address what she calls the ‘dysfunctions of the networked information economy’. 

Privacy has been viewed through the prism of individual rights, even if there is a general acceptance that individual rights and freedoms are social goods. Cohen focuses on the limitations of this perspective. ‘Atomistic, post hoc assertions of individual control rights … cannot meaningfully discipline networked processes that operate at scale. Nor can they reshape earlier decisions about the design of algorithms and user interfaces.’ She takes aim especially at ‘the continuing optimism about consent-based approaches to privacy governance’, which she finds ‘mystifying, because the deficiencies of such approaches are well known and relatively intractable.’ The landscape is too complex and manipulation too easy for consent to be the answer. Tech firms can game user control rights with ‘synthetic data’ that tends to lie outside the framework. She (like Paul Schwartz much earlier) sees the consent requirement as akin to property rights, which have failed to safeguard the interests of the marginalized and poor. ‘It makes no sense whatsoever where networked, large-scale processes are involved.’

Novel and creative apparatuses that aim to compensate for the limitations of individual control, like ‘user governed data cooperatives’ or ‘automated consent management panels’, might work for ‘smaller, more homogeneous communities’ where it is obvious which resources are to be protected. But their effect, Cohen argues, is to ‘turn consent into a fig leaf’, useful only ‘to achieve compliance with a regime that requires symbols of atomistic accountability.’

Moreover, the hoped-for ‘trust’ and ‘duty of care’ often touted – fetishised even – as an all-purpose salve, a sort of soft-law social contract between commercial digital players and society, would never be considered sufficient for banks, pharma or insurance companies. We are too aware of the risks in those fields to allow such latitude. In the digital economy, ‘dominant actors’ could only be expected to change their surveillance-based business models if their ‘corporate ownership and control structures’ as well as ‘licensed flows of data’ were disrupted. They set the terms and protocols for data use that developers and ‘marginal actors in networked information ecosystems’ – from media companies to white supremacists – have every incentive to follow.

Until recently, there were vast swathes of the planet without any general privacy and data protection laws. Those regulatory deserts are now dwindling, with Brazil and soon India adopting rules that will extend legal protection to billions of people. (See especially the work of Gabriela Zanfir-Fortuna at the FPF in tracking these developments.) Authoritarian China is also aping GDPR-style rules, reflecting popular exasperation with its own surveillance capitalists, though the rules are clearly subservient to the far greater priority of state coercion and control. Ironically, the United States could soon become the global outlier.

Cohen, however, critiques data protection on the grounds that it ‘relies on prudential obligations’ and ‘invites death by a thousand cuts’, unable to scale to today’s tech leviathans and systemic privacy depredations, reducing well-meaning laws to ‘an exercise in managerial box-checking’. She takes the GDPR to task, for example, for requiring data protection-by-design without specifying what the design ought to be: ‘There is a hole at the center where substantive standards ought to be—and precisely for that reason, data protection regulators often rely on alleged disclosure violations as vehicles for their enforcement actions, reflexively reaching back for atomistic governance via user control rights as the path of least resistance.’ She notes that the proposed privacy bills all shy away from regulating government use of data, so they would not remedy the shortcomings identified in the two CJEU Schrems judgments. Nor, for that matter, do they seem interested in the principle of purpose limitation in the use of personal data. It is unclear, however, what Cohen would prefer to see. The GDPR and its fellow travellers are criticised for their tendency towards over-prescriptiveness, but at their core is an attempt not to micromanage business practices, still less to outlaw business models, but to make companies accountable for whatever they choose to do with the data they control. Consent is not, in fact, at the heart of the GDPR.

Cohen is at her most convincing when highlighting the gap between legislative promise and delivery of outcome. ‘If one wants to appear reformist while moving the needle only very slightly—one confers authority to make rules and bring enforcement actions, and then one waits to reap the predicted beneficial results.’ This model has broken down, she says. It’s like a car manufacturer or (at least before the 2007-8 crisis) mortgage lender that fawns over its prospective customers before neglecting them after the purchase has been made and whenever problems arise. The ruthless injunction to ‘Always Be Closing’ in David Mamet’s Glengarry Glen Ross becomes Always Be Legislating, and let the enforcement look after itself. ‘Enforcement practice,’ however, ‘has largely devolved into standard-form consent decree practice, creating processes of legal endogeneity that simultaneously internalize and dilute substantive mandates.’ The dirty secret of these enforcement strategies is that Big Tech’s excesses remain and have if anything got worse during their implementation.

These ‘litigation-centered enforcement mechanisms’ can only ever be as effective as access to justice, which, at least in the United States, has diminished over the years as standing and class actions have been curtailed. Public enforcers have been underfunded and become increasingly risk-averse. Infringements of privacy being systemic, Cohen thinks it is insufficient to single out the individual actions of bad actors pour encourager les autres, as the effect is to ‘validate the mainstream of current conduct rather than meaningfully shifting its center of gravity’. The behaviour regulators seek to penalise and reform is so profitable that occasional enforcement can be planned for, and absorbed, by the bigger companies as an acceptable business overhead – Cohen cites the 2019 FTC $5bn settlement with Facebook, which broke records but changed nothing.

What is the alternative? Privacy regulation should move out of its silo and borrow from post-crisis financial regulation’s ‘operating requirements for auditing, benchmarking, and stress testing,’ and for ‘public-facing transparency about information-handling practices’. ‘Honoring the public’s right to know requires a less deferential approach to the secrecy claims that have become endemic in the networked information era.’ Instead of drop-in-the-ocean civil fines, public enforcers should be able to reverse the incentive bias by disgorging profits accruing from unlawful activity, and return those ill-gotten gains not only to the individuals affected but also plough them back into the public enforcement budget. Where violations with knowledge and intent have been proven, it should be possible to target the personal wealth of those accountable board members, and especially where they happen also to be majority shareholders. These are means for internalizing the externalities of their immensely profitable enterprises.

Julie Cohen’s remedies echoed and coincided with those proposed by the apparently chastened former CFO of Goldman Sachs. Writing in The Economist, Martin Chavez said, ‘Just as CCAR places limits on the leverage in derivatives portfolios by imposing capital requirements, digital regulators would insist that platforms constrain the algorithmic amplification of outrage and emotional resonance by limiting the propagation of viral content. Regulators would base their rules on a deep and shared understanding between the regulator and the regulated of how digital companies make money, just as banks simulate their complex chain of interlinked businesses into the future in a stress test.’ Chavez concedes that the Dodd-Frank Act and its ilk have hardly ushered in a golden era of financial morality, but he is not alone in wondering why what’s good for the banking goose cannot equally be served up to the digital gander.

These are worthy and thoughtful additions to the debate on how to forestall the failure of tech companies that have become ‘too big to fail’, by means of a concoction of intricate technocratic rules that only the biggest can be expected to be able to comply with. For engagement with the underlying questions of growing inequality, digital or otherwise, and of when a company may simply be ‘too big’ for a purported democracy of human rights and equality before the law, we must look elsewhere.

Un omaggio

This reflection on the life of my boss, Giovanni Buttarelli, was originally posted on https://iapp.org/resources/article/memoriam-giovanni-buttarelli/ and has been repatriated here, a year to the day since he passed away.

I don’t have many photos of me and Giovanni. Just as well, because his svelte poise only threw into ever-starker relief the balding squatness of my advancing years. I found one, though, captured when, with the assistance of my (surly) two-year-old, I had rushed a document for his signature before he went through the security gates at Brussels airport.

He was entitled to be confident, and not only because of his debonair demeanour. He had a global fan base, the extent of which we are only beginning to fathom, as the tributes come flooding in. But a more important endowment was his self-doubt. He concealed it expertly, but I believe it was what made him intensely sensitive to what other people thought and said, and what gave him his insatiable hunger always to do more and do better.

Thanks to him, our floor was suffused with the smell of proper coffee. “No coffee, no meeting,” he warned his personal assistant, arriving in the office one morning. On another occasion, she was scandalised when I said he had asked me for another tazzina. That would be his fifth of the morning, she said. No way. I will make him a decaffeinato.  

“We need to be more communicative,” was one of his refrains. But not too communicative: “We have to speak in an institutional language,” he would say if my drafts got too fruity. Overall, he wanted data protection to descend from its ivory tower, and demonstrate its relevance and value in the real world. He likened arid legal opinions to recitals of the Mysteries of the Rosary, which he would recall in monotone solemnity:

Nel primo mistero gaudioso ricordiamo l’annunciazione dell’Angelo a Maria Vergine…

Nel secondo mistero gaudioso ricordiamo la visita di Maria Santissima a Santa Elisabetta...

He worried about parochial jargon being a barrier for reaching out to non-specialists. Data protection was about showing respect for people. It was not an absolute right. It does not block technological progress or public safety or other things that society cares about; it’s essential for ensuring these things are done responsibly and sustainably. Take risks, don’t hide behind consent, but own those risks. Data protection lawyers, Giovanni would say, should not be like the Trappist monk from Non ci resta che piangere who appears from nowhere to harangue the unsuspecting Massimo Troisi with a reminder of his mortality: Ricordati che devi morire! Engage with the policy and be persuasive.

Giovanni’s priority as Supervisor was to be more ‘conversant with technology’. So in the first year of his mandate he went to Silicon Valley. He left convinced that the biggest challenge was not compliance with arcane data protection rules. Rather it was the assumption, across boardrooms, venture capitalists and garage start-ups, that the only way for digital services to be profitable was to track people, profile and target them. Giovanni began to pepper his talks and articles with the mantra of ‘the dominant business model’, before it became cool to do so. Only latterly did he have the chance to connect with fellow traveller Shoshana Zuboff when at the beginning of this year she came to Brussels to unveil her monumental exposé of “Surveillance Capitalism”. At his last public event, a high-powered powwow on privacy and competition that he co-hosted with the German Federal data protection commissioner, each of his fellow panellists echoed the phrase. Giovanni’s diagnosis was no longer eccentric and ‘out there’, it had become the new orthodoxy.

Giovanni was a policy entrepreneur. He spotted opportunities and threats on the horizon before others did. In 2015, with the data protection world absorbed with the GDPR negotiations and the judgments in Schrems and Google Spain, he pitched the concept of ‘digital ethics’. He argued that artificial intelligence and smart-this-that-and-everything challenged not so much privacy as the basic, universal and inviolable right to human dignity. He got the world talking about ethics and technology at the 2018 international conference of privacy commissioners. Now chatter about “ethics and AI” has become so commonplace it risks descending into banality.

Giovanni, following his predecessor Peter Hustinx, saw – again long before it became cool – that privacy in the age of digitisation was inseparable from market power and therefore competition enforcement. It was in his garage, setting off to deliver a speech to antitrust lawyers, that the idea of a ‘Digital Clearinghouse’ was hatched. It would bring together all willing enforcement agencies with responsibilities for digital markets. By July this year, he would share a platform with an outgoing Secretary General of the European Commission who called on multiple regulators to act as if they were single regulators, and predicted that the convergence between privacy and competition would be ‘a running theme’ for the next five years.

Giovanni was alone in 2017 in telling us that the novel obsession with ‘fake news’ pointed to an underlying, systemic data protection problem. Data protection was not just about the individual – it was a core safeguard for societal cohesion and democracy itself. It is now cool to say this. There is no greater compliment to the man than that he has others now reading from his playbook.

It was on such broad canvasses that Giovanni painted. But what most engaged him were the finer details of legal texts; he was a magistrate probing arguments, teasing out potential ramifications. Watching him in action from close quarters was a great privilege. He would not impose his views but lead you through legal quandaries to a satisfactory conclusion. For several brain-sapping weeks in early summer 2015, he convened marathon sessions with the EDPS’s best lawyers to pick over three competing versions of GDPR, by then into its fourth year of negotiations (the Commission’s original proposal, the European Parliament and Council amendments), along with the EDPS’s own extensive opinion from 2012. The result was a GDPR mobile app juxtaposing the different texts with the EDPS’s advice on how to resolve the discrepancies.

Privacy was his last profession and he practiced what he preached. He did not burden his closest colleagues with his personal affairs. And he never meddled with mine, except after I told him my wife was expecting again and he advised us to go buy ourselves a TV. There was no need for him to pry, because I know he cared.

The world was robbed of Giovanni too soon. His mind was awash with ideas. Recently he was finalising a ‘manifesto’* for the future of privacy. In one of his last messages to me, he said, to paraphrase, “This had better be the best document of all time. I want the whole world talking about it. Balls of steel.” Then, after a pause, “You too are playing with your future with this document. Get it wrong and you will end up in la merda.” In our final conversations, we spoke about the climate crisis and how technology instead of being part of the solution had instead become part of the problem with its carbon-emitting data-madness and reckless natural resource extraction. He had plans to visit China.

I wish I could remember more. I wish I had written down all his obscure Latin phrases and his tawdry Roman dialect expressions. But he has left us with plenty to be getting on with. So many words. Parole, parole. Some receding, others lingering, like ghosts at cockcrow.

Grazie, caro Consigliere.

* Giovanni Buttarelli’s manifesto, Privacy 2030: A New Vision for Europe, was published posthumously in November 2019.

‘Privacy 2030’, Giovanni Buttarelli’s ‘manifesto’

My presentation of ‘Privacy 2030: A Vision for Europe’ at IAPP Data Protection Congress, Brussels, 21 November 2019

What would Giovanni have said if he were still with us?

Well, first of all he would have noted that this event has coincided precisely with the
future-fictional setting of Blade Runner.

Now here we are, Brussels, 21 November 2019, and reports of imminent tech dystopia
are greatly exaggerated; at least in Belgium.

Shortly after Giovanni asked me to work for him, I would learn that there were three
moments in the now very congested privacy calendar where he would want to shake
things up, say something new – qualcosa di fico.

Two of these moments were IAPP moments: in springtime Washington DC, around the time
the cherry blossoms briefly adorn the National Mall; and here in grey, soggy, autumnal
Brussels, when the people take refuge in cafes and strong dark beers.

Trevor Hughes and Omer Tene would always offer Giovanni a platform for his insights,
and it spoke to a deep mutual affection between them.

A year ago, he was here and announced his intention to publish a manifesto on the
future of privacy and digital society, looking beyond the GDPR.

So he would be grateful to the IAPP for publishing this week ‘Privacy 2030: A Vision for
Europe’, and to his seven eminent ‘fellow travellers’ – Omer, Maria Farrell, Malavika
Jayaram, Jules Polonetsky, Rocco Panetta, Marc Rotenberg and Shoshana Zuboff – for
contributing their own reflections on the big theme.

He would have been moved that the first lecture in his memory would be delivered by
Commissioner – soon to be Executive Vice-President – Margrethe Vestager.

There was a special chemistry between these two great thought leaders, though they
come from very different backgrounds, at opposite ends of Europe. It is clear that much
of what he believed in will be continued by her as she tackles the huge challenges ahead.

Giovanni’s three children Serena, Gianluca and Eleonora are with us today, and in their
charming company you sense their father’s calmness, lucidity and Sphinx-like
inscrutability, in this still traumatic period of coming to terms with such a loss.

Among the myriad Buttarelli awards, foundations and even meeting rooms being
proposed, it is Serena, Gianluca and Eleonora who will be the ultimate custodians of
their father’s legacy. And I want to thank them for their moral support in this project.

The manifesto is a meditation on the big challenges and a call to action.

It talks about the power of big companies and governments to do things with data and
technology. And it talks about the vulnerabilities of the children, low-paid workers and
migrants who have those things done to them.

It makes the link between digitisation and the environmental crisis. Our insatiable
enthusiasm for throwaway devices and generating more and more data is actually
increasing our carbon and biodiversity footprint precisely at the moment we are
supposed to be urgently reducing it.

It proposes ways the EU can empower people and address genuine social and
environmental problems.

It reflects what you could call ‘late style’ Buttarelli.

‘Late style’ in an artist is when she ‘constructs an alternative universe which somehow
helps us understand the world we live in now.’

If you read his blogposts, speeches and interviews over recent years, you see that
Giovanni wanted to call out things that were wrong.

But, at the same time, he retained a fervent pragmatic optimism for an alternative
future, and especially for the role of Europe in constructing it.

It may be the fate of most manifestos never to be implemented – but the good ones will
at least get people talking.

And indeed, the one instruction he gave me was to be provocative. He warned that unless
it made the cover of Time magazine, I would have failed and could pack up my belongings
and leave the building.

Well, Giovanni, if you’re listening: it has not yet made Time magazine, though it has
featured in the latest MLEX data and security daily briefing. Baby steps.

So this is my last ghost-written tribute to Giovanni. And the best tribute I can offer him
is to help keep you talking about him long after he is gone.

I hope his manifesto will do just that.