Risk of Risks

On January 1st 2020, the California Consumer Privacy Act came into effect. California's economy is the largest in the United States and ranks as the world's fifth largest. It is also home to some of the world's most powerful tech companies.

Data privacy has already been elevated to a board-level, big-ticket issue, with threats of legal and financial sanctions and risk to reputation and brand.

Reporting by IAPP Europe (January 2020) highlights: Today's lightning advances in technology in the fields of AI, automation and cloud services will usher in fundamental and complex changes in the way data protection unfolds. The organization that does not adjust to the new paradigm may not see the next decade.

What’s in store in this new decade?

Third-party risk management will remain a key trend in the coming years. In a pervasive digital age, more robust outsourcing, vendor management and supply chain solutions will be central to organizational strategy; therein lies considerable risk and exposure. Moreover, where breaches at the larger multinationals have dominated the landscape in recent times, their third-party relationships may prove more vulnerable in the coming years. Defending and proofing those supply chains on a continual basis will be critical.

Acknowledgement: IAPP Europe Data Protection Digest January 2020

I am a member of IAPP

Is Artificial Intelligence on a collision course with data protection?

Few people know their rights under existing data protection laws. Here is a reminder:

The last paragraph is significant as it refers to the capabilities of AI to process personal data. Those that use AI are accountable under the GDPR and the Data Protection Act 2018 (UK).

I was interested to read a document: Resolution of the 97th Conference of the Independent Federal and State Data Protection Supervisory Authorities of Germany, Hambach Castle, April 3, 2019.

The resolution (particular to Germany) records seven data protection requirements.

  1. AI must not turn human beings into objects
  2. AI may only be used for constitutionally legitimate purposes and may not abrogate the requirement of purpose limitation
  3. AI must be transparent, comprehensible and explainable
  4. AI must avoid discrimination
  5. The principle of data minimisation applies to AI
  6. AI needs responsibility
  7. AI requires technical and organisational standards

This resolution has no legal standing, but it does provide a pointer to the considerations that those using AI should weigh to remain compliant with their data protection responsibilities.

You can read the full resolution when you click here

Artificial Intelligence and Risk

The surge in interest in AI has generated the need to know the risks and how they might be insured. Now that does not occur to many people as they rush headlong into the AI jungle. It does occur to those responsible for governance and compliance and they look to organisations like Zurich Municipal for guidance.

So what is the point of view of insurers and their appetite to underwrite this risk? It is conditional. No surprise there.

From Zurich Municipal Report Page 3

The thing is AI is very challenging from a governance perspective. Why?

To know the answer, read the report – it is only seven pages, and you can download it when you click here.

Boardroom priorities

If you are in sales and focus on enterprise-sized businesses (those with more than 250 employees), do you wonder what the priorities are of those you want to influence?

Here is a picture for you to digest. Good news for recruiters assisting with talent and succession. Also good news for tech vendors competent in Digital innovation and Cyber security.

Strategy sits between Digital innovation and Cyber security. What does that conversation look like particularly as 24% of respondents to the survey think DIGITAL COMPETENCY in the boardroom is a skills priority?

If you want a primer for that conversation click here

Source Predicting The Unpredictable – Harvey Nash Board Report 2020

The top three competencies sought by boards are


Strategy (I presume that is linked to Technology/Digital?)

Sector expertise

Now if you are in tech sales what will you bring to the conversation?

Click here for fancy infographic

Click here to obtain a copy of the report

GDPR – IAPP update – Germany

It has been a busy couple of months for the German regulatory community. In November, a report on the state of play of GDPR implementation — and the experience thus far — was drawn up by the Conference of Independent German Federal and State Data Protection Supervisory Authorities and adopted at its 98th Conference. The Datenschutzkonferenz (or DSK, as it is better known) is the umbrella structure that comprises all the state regulatory authorities in Germany, as well as the federal authority, and is tasked with issuing uniform and official resolutions, guidance and statements reflecting national and European law.

The publication of this report is quite the task as it needs to encompass a high level of consistency, as well as consensual opinion on the evaluation and review of GDPR implementation to date. This must be done across a large group of regulatory bodies as required in accordance with Article 97 of the GDPR. Moreover, the aim of such a review is ultimately to derive suggestions and recommendations for improvements to ensure a more optimal implementation of the regulation. I am happy to say, that for all the non-Germanophone privacy pros out there, this report now exists in English and can be found here. The findings are too many to mention here, but the DSK broadly shares the opinion that the GDPR’s regulatory concept and objectives have been largely successful to date in the pursuit of enhancing the protection of fundamental rights and contributing to the creation of the Digital Single Market in the EU.

Interestingly, annexed to the GDPR report is the Hambach Declaration on Artificial Intelligence, a resolution also adopted at the 97th DSK Conference. It sets out seven key data protection principles for addressing data protection in the field of artificial intelligence and automated decision-making: informing the debate; informing a digital future.

This report also comes on the heels of the DSK releasing GDPR fining guidelines in late October. All this at a time when there has been a growing entrenched public perception centered around the potential for high fines associated with GDPR enforcement. Raising privacy and data awareness comes with an imperative for both regulatory authorities and businesses alike; the work must be done. German authorities have already started to apply the DSK-fining methodology. The Berlin data protection authority — which also took the lead in developing the fining framework — recently issued a fine of 14.5M euros using the five-step process design. The case itself relates to excessive retention of personal data by a real estate company and its failure to implement privacy-by-design principles. What is generally accepted is that the DSK framework is aggressive in that the current model will almost certainly impose higher fines than expected and controversially more so for organizations with high revenues.

From an EU perspective, the EDPB is tasked with ensuring the consistent application of the GDPR throughout the EU. Importantly, it is expected to adopt a harmonized fining methodology, but no timeline has been identified as yet for this. In the interim, national frameworks — German and other — will remain the relevant methodologies in their jurisdictions. This may lead to some very colorful enforcement actions and maybe some testy legal challenges, too. 

I am a member of IAPP and credit the source of this article (as below):

Paul Jordan
Managing Director
IAPP Europe

Digital Operating Model

A new day, a new book. So I have been writing with an expert team about the Digital Operating Model.

If you were routed from my post 'Boardroom Priorities', then the book will prime your conversations for four of the five top-ranked topics on the minds of senior executives.

The book was commissioned by Microsoft and launched at Microsoft Ignite, Orlando, on 4th November 2019, where 2000 copies were distributed.

The book is now available as a free download from microsoft.com and you can gain access when you click here.

Also available in book format

Digital mash-up

So I randomly found an article by McKinsey Digital while researching the use of digital in charities. The article's headline is 'How digital is changing leadership roles and responsibilities'. You can access the article when you click here.

I read on and find this passage: ‘having more digital leaders at the CxO-level doesn’t necessarily make the technology function of an organization better. Appointing a chief digital officer doesn’t necessarily make a company more effective in developing and deploying digital solutions. Even more striking, adding these new roles without an aligned operating model can actually lead to more confusion, power struggles, and a negative effect on the company’s overall IT performance.’

I pick out in bold what caught my attention. Why? This is exactly what I wrote about with my co-authors in a book titled 'Thinking of Building a Microsoft Cloud Operating Model? Ask the Smart Questions'.

Curious to know why you need an operating model? Then click here and all will be revealed.

Cat image has nothing to do with this blog but I have noticed cat images get a good following. Prove me right!

Turning ideas into cash

Every entrepreneur's challenge is to turn an idea into cash. The failure rate is high, particularly in tech, as there is a great deal of competition. Oh well, that is not gonna stop you, right?

My work with the European Union Horizon2020 CloudWatch2 project gave me insight into the characteristics of R&I-funded projects and the difficulty of moving from product development into commercial realisation.

Moment of clarity

In a rare moment of clarity I pieced together an idea to combine Technology Readiness Levels (TRL), commonly used to track the progress of an R&I project (see https://en.wikipedia.org/wiki/Technology_readiness_level), with something I called Market Readiness Levels (MRL) (for which there is no Wikipedia reference).


Conjoining TRL and MRL created a methodology known as MTRL, used to control the technology and commercial outcomes of an R&I project. It was applied successfully with a number of CloudWatch2 projects.
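As a rough illustration (not the official MTRL tooling – the scale, names and threshold below are my own assumptions), pairing a project's TRL and MRL scores makes a readiness gap easy to flag:

```python
# Hypothetical sketch: pairing Technology Readiness (TRL) and Market Readiness (MRL)
# scores on a 1-9 scale. Function names and the gap threshold are illustrative only.

def mtrl_gap(trl: int, mrl: int) -> int:
    """Gap between technology and market readiness; a large positive gap
    suggests the technology is outrunning its route to market."""
    return trl - mrl

def needs_commercial_attention(trl: int, mrl: int, threshold: int = 3) -> bool:
    """Flag projects whose market readiness lags well behind the technology."""
    return mtrl_gap(trl, mrl) >= threshold

project = {"name": "demo-project", "trl": 7, "mrl": 3}
print(needs_commercial_attention(project["trl"], project["mrl"]))  # True
```

The point of the pairing is exactly this kind of early warning: a project can be technically mature yet commercially unready, and tracking both numbers together surfaces that before the funding ends.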

More information at www.mymtrl.eu

The ultimate outcome of this invention was the gift of MTRL to Oxford University Innovations. The invention and its application are documented and accessible when you click here. Scroll to the bottom of the page for the report or click here to access it.

GDPR – lessons

My role as Co-Founder of Digi-Board includes responsibility for our compliance with #GDPR.  I trained at Henley Business School under Prof. Ardi Kolah and learned that a focus on compliance alone is the wrong way to put in place ‘privacy by design’.

Digi-Board is a customer of GoCardless to process online payments and I republish below an article (June 2019) sharing practical real world experience of GDPR from the Data Privacy Officer of GoCardless.  A great read for senior management and data privacy professionals.


How do you comply with every prescriptive element of GDPR, and meet the principles of the regulation, in a way that minimises unnecessary distraction from your core business? In short: how do you create ‘privacy by design’?

Few companies hire enough people with ‘privacy’ in their job titles to meet all the requirements of GDPR. It follows then that if privacy sits on top of normal business processes, it won’t scale.

With that in mind, here are five things we’ve learned over the last year about embedding privacy in the business.

1. Speak the language of the business

We didn’t get this right the first time around. To build our GDPR-compliant register of processing activities, we used questionnaires sent out from an off-the-shelf tool.

We asked all our data processing teams a lot of questions – all the wrong ones, as it turns out. “Can you identify a lawful basis of processing for this activity?” “How are you meeting the principle of purpose limitation for this activity?”

We knew we had gotten it wrong when we looked at our GDPR-compliant register and saw dozens of different variations on the term “not sure”!

In v2.0, we took a different tack. We asked the business only the questions we knew they could answer, like: what are you trying to do with the data? What data do you need to do it? What systems help you accomplish it? As a result, our updated register is clear, actionable and easy to keep up-to-date.

2. Be where the business is

We can’t have a privacy expert in every meeting – there aren’t enough of us, and even if we could be everywhere all the time, it would just slow things down.

But that means almost every GoCardless employee will at some point have to make decisions that have a privacy impact – like scoping a new product, choosing a new supplier, or training a new data model.

I have seen even very well-designed privacy programmes fail when they just aren’t adopted by the business.

When people are asked to step out of their day-to-day role, they’ll tend to take the path of least resistance. It’s not because they don’t want to do the right thing! But even if they understand what we need them to do (and see point 1), the process we’ve created might just make it hard for them to do it.

Privacy processes can’t stand alone; they need to be part of business as usual. Our head of data puts it nicely: we need to make it really easy for people to do the right thing and really hard for them to do the wrong thing. Which leads to…

3. Automate as much as possible

As the privacy field matures, we’re starting to see tools offering out of the box automation and compliance.

The problem with many of these is that they offer a standalone experience: a tool for managing data processing agreements that doesn’t sit within a broader supplier contracting function; a tool for tracking data subject access requests that can’t be used by Support; a data protection impact assessment that isn’t part of the product development lifecycle.

Privacy processes that don’t fit within a broader business context will take people out of their day-to-day. Then, if they’re done at all, they aren’t done well.

We’ve found it more effective to start with the business – what does their day-to-day look like? What documents do they create, what tools do they use, what are their decision-making points?

Those are opportunities to ask the right questions at the right time, and to be able to escalate to the privacy team where necessary.

For example, when our data teams build a new feature, they’re prompted from within the process itself to identify a business purpose from our (now clean and up-to-date) GDPR register. If a business purpose isn’t present, the model can’t be built. And if there isn’t a suitable purpose listed in the register, then it’s an indication that something new is happening that needs privacy review.

The process also gives us an audit trail that we can test to make sure the right decisions are being made.
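A purpose gate of this kind can be sketched in a few lines. The register entries and function names below are my own illustrative assumptions, not GoCardless's actual tooling:

```python
# Hypothetical purpose-gate: a model build proceeds only if its stated business
# purpose already appears in the processing register. All entries are invented.

REGISTER = {
    "payment_fraud_detection": {"lawful_basis": "legitimate interests"},
    "customer_support_analytics": {"lawful_basis": "contract"},
}

def can_build_model(business_purpose: str) -> bool:
    """Gate a build on the purpose being present in the register; absence
    signals that something new is happening and needs privacy review."""
    return business_purpose in REGISTER

print(can_build_model("payment_fraud_detection"))       # True
print(can_build_model("novel_marketing_segmentation"))  # False
```

The design choice is that the check lives inside the build process itself, so nobody has to step out of their day-to-day to consult the privacy team; the escalation only happens when the gate fails.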

4. But beware of silver bullets

Automating privacy processes can end up working against you. Some companies make programmes scalable using checklists. But this approach can backfire.

Layers of bureaucracy badly applied disempower employees, keep them from being accountable for privacy impacts, and lead to unexpected risks (“this wasn’t on the checklist, so it must not be a problem”).

We’ve been careful to keep our processes simple, and focused heavily on training and guidance for our teams.

For example, we’ve launched training for our product managers and functional leads, giving them the resources to think about building privacy into our products from start to finish.

One resource has been a particularly useful part of our product scoping documents and privacy impact assessments: A tailored taxonomy of privacy risks that helps guide thoughtful conversations about minimising unintended or unlawful consequences from the use of personal data.

5. Listen to what your programmes tell you

GDPR allows data subjects to exercise their rights with the data controller. The two rights requests we see most often are subject access requests and subject deletion requests.

Early on, we made a decision that subject rights requests don’t go straight to our privacy team. They are handled first by our customer support agents using their own tools (Zendesk macros and our Support Hub), before they go to our rights request software to track to completion.

This has been very successful for two reasons: First, these requests don’t happen in isolation. Sending the requests to Support first brings them to the people who are best trained to identify and resolve the underlying problem (supported of course by training and resources from the privacy team).

Second, our Support team has an enormous amount of experience with metrics and KPIs. Using their tools allows us to keep close track of SARs as well as other complaints, questions and incidents.

How quickly and efficiently we can handle an access or deletion request tells us a lot about the health of our privacy programme, and tracking these metrics is one of our key risk indicators.
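As a minimal sketch of that kind of metric (the request log and its layout are invented for illustration), a mean days-to-complete figure for SARs can be computed directly from received/completed dates:

```python
from datetime import date

# Hypothetical SAR log as (received, completed) date pairs; the data is invented.
sars = [
    (date(2019, 5, 1), date(2019, 5, 10)),   # 9 days
    (date(2019, 5, 3), date(2019, 5, 28)),   # 25 days
]

def mean_days_to_complete(requests):
    """Average turnaround in days – a simple key risk indicator for the programme."""
    return sum((done - received).days for received, done in requests) / len(requests)

print(mean_days_to_complete(sars))  # 17.0
```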

We track other risk indicators too, like marketing unsubscribe rates, supplier risk ratings and time to respond to data-related legal tickets. These tell us a lot about where the gaps are and allow us to optimise.

That feedback allows us to make constant incremental improvements to the programme, and also helps us meet the principle of Accountability, the heart of GDPR.

Credit to GoCardless.

It’s a digital world – so what?

If you have bootstrapped a business, as I have more than once, then you will know that you just have to embrace #digital technology.  At a micro scale that is easy with plentiful applications and services a click away in the #cloud.

When you are a big business with a legacy of technology accumulated over many decades, and with people and processes linked to the evolution of that legacy, it is much harder to adapt.

A couple of thoughts on this.

I spotted this article from heavy hitters McKinsey, and it is a worthwhile read.

Of course there is always a gulf between theory and practice, and that is why I co-wrote a book, 'Thinking of… Building a Microsoft Cloud Operating Model? Ask the Smart Questions' (www.cloudoperatingmodel.com), that digs deep into helping the business and technology teams collaborate on 'making it happen', as McKinsey describe.

I have five copies of the book to distribute to the first five people to respond to this post.  All you have to do is reply to the question below by email to frank@frankbennett.co.uk by 15 June 2019.

Q.  What is the difference between agile and Agile?

There is no right or wrong answer!