Clinical Commissioning Groups’ Sale of Patient Data

Arguably the most significant reform of the Health and Social Care Act 2012 is the introduction of a National Health Service Commissioning Board, which will supervise local primary care clinical commissioning groups. These clinical commissioning groups will replace primary care trusts. Primary healthcare providers, particularly GPs, have always been the gatekeepers to the National Health Service, but under the 2012 reforms they will also be the principal budget holders, through these clinical commissioning groups, buying secondary care in a quasi-competitive open market.

Under ss14X and 14Y of the National Health Service Act 2006, following wholesale amendment to that 2006 Act by the Health and Social Care Act 2012, clinical commissioning groups will have separate statutory duties to promote innovation and research. The groups also have a duty to carry out their functions effectively, efficiently and economically (s14Q).

To assist clinical commissioning groups in their extensive duties set out in Part 2 of the 2006 Act, they will have the power to raise income (a new power under s14Z5 of the 2006 Act) by doing anything the Secretary of State can do under ss7(2)(a), (b) and (e) to (h) of the Health and Medicines Act 1988, either alone or with other groups. In particular, s7(2)(f) will permit the groups “to develop and exploit ideas and exploit intellectual property.”

Whilst it may be a stretch to argue that clinical commissioning groups have a duty to exploit the value there may be in patient data, exploiting that data falls well within their duties and powers under the 2006 Act. In addition, disclosure of information “made for the purpose of facilitating the exercise of any of the clinical commissioning group’s functions” is explicitly permitted by the 2006 Act (s14Z23(1)(f)).

This only leaves the Data Protection Act 1998 to consider. Could clinical commissioning groups sell patient data under the Data Protection Act 1998, with or without the consent of patients themselves?

This is an interesting question. One answer is that it would be possible. In order to process patient data, the groups would have to meet one of the conditions for processing sensitive personal data (as defined in the Data Protection Act 1998).

It is arguable that there is a statutory basis for selling the data, namely compliance with commissioning groups’ statutory duties to promote innovation and research, and to raise income in order to exercise their statutory duties effectively, efficiently and economically. As there is a statutory basis, the selling of patient data could be argued to be “necessary for the exercise of functions conferred by or under statute” – one of the conditions under which the processing of sensitive personal data is permitted by the Data Protection Act 1998 (paragraph 7(1)(b) of Schedule 3). This does not require patients’ consent.

In addition, processing for research purposes is itself a permitted purpose under the Data Protection Act 1998 (paragraph 10 of Schedule 3 of the Data Protection Act 1998, and paragraph 9 of the Schedule to the Data Protection (Processing of Sensitive Personal Data) Order 2000, SI 2000/417); again without patient consent.

Lastly, there is always the ‘medical purposes’ condition of paragraph 8 of Schedule 3 to the Data Protection Act 1998:

8 (1) The processing is necessary for medical purposes and is undertaken by—

(a) a health professional, or

(b) a person who in the circumstances owes a duty of confidentiality which is equivalent to that which would arise if that person were a health professional.

(2) In this paragraph “medical purposes” includes the purposes of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of healthcare services.

Patients’ consent may not strictly be required by law, but under the first (and second) data protection principles, patients will have to be made aware by clinical commissioning groups that their data, in whatever form, may be sold for medical research purposes. Patients could try to exercise a stop notice right under s10 of the Data Protection Act 1998, but ‘good luck with that’ is the phrase that immediately springs to mind.

Although it may be lawful for commissioning groups to sell patient data, it may be that best practice will lead to any sale being restricted to Barnardised or anonymised patient data. This may be clarified by the NHS Commissioning Board, which has a responsibility to issue guidance on the processing of patient information under s13S of the 2006 Act, following the abolition of the National Information Governance Board for Health and Social Care in the 2012 Act. ‘Patient information’ in this context is a new term defined at s20A of the Health and Social Care Act 2008, and is broader than a patient’s personal data, as defined under the Data Protection Act 1998, to include any information about a person’s health, diagnosis or care or data derived from that information, whether that information or data identifies an individual or not.

So, a case can be made for saying that commercialisation of patient data may well be a consequence of the Health and Social Care Act 2012. Whether this consequence was unintended or was anticipated is for others to answer.


Twitter, Google and EU Privacy


At the end of February it was reported that Twitter was selling off old tweets to marketing companies. Google, with effect from 1 March 2012, changed its privacy policy for all of its services, including YouTube, Gmail and Blogger as well as the ubiquitous search engine. In neither case were users’ consents obtained for the transaction or the changes. This raises a number of privacy and data protection issues. In Google’s case the EU Justice Commissioner, Viviane Reding, has gone on record saying “transparency rules have not been applied”. The French data protection authority, the CNIL, has launched a Europe-wide investigation into the Google policy changes.

I predict that there will be more of these announcements and privacy policy tweaks during the coming months. Companies holding large banks of European Union users’ or customers’ data have a small window of opportunity to commercialise that data before a new European Union data protection regulation takes effect. The draft of this regulation was published by the European Commission on 25 January 2012. In its current draft form, the regulation will begin to apply 2 years from the date it comes into force. No national laws are required to bring an EU regulation into effect in a member state.

Companies will therefore have 2 years in which to rely on the more relaxed rules of the Data Protection Directive 95/46/EC. In particular, some processing that can currently be conducted without the consent of individuals – new uses of the individuals’ data that are in the “legitimate interests pursued by [the company] or by the third party or parties to whom the data are disclosed” – will become much more difficult, if not impossible, under the regulation.

The whole nature of consent is properly addressed in the draft regulation. Under the Directive, data can be processed where there is unambiguous consent. In the UK implementation of the Directive, the Data Protection Act 1998, it has always been possible to obtain consent indirectly for data that is not “sensitive personal data”. Whilst this has been one of a number of long-standing issues between the European Commission and the UK on data protection, there is a new provision in the draft regulation that will address valid consent. Of particular interest in cases such as Google’s, given its dominance of the search engine services market, is the draft provision that states “consent shall not provide a legal basis for the processing, where there is a significant imbalance between the position of the data subject and the [company]”.

This goes back to another of the significant changes in the draft regulation. In the Directive there is a basic provision that personal data must be “processed fairly and lawfully”. In the regulation, the equivalent provision is “processed lawfully, fairly and in a transparent manner in relation to the data subject”. Expect some interesting arguments about transparency in the coming months – perhaps these have already started, given Viviane Reding’s comments on the Google changes.

To make matters even more interesting, the draft regulation gives consumer bodies the standing to complain to a supervisory authority about data protection breaches on behalf of individuals. Super-complaints, as they are known in competition law, will up the ante for regulators – it is easy for the Information Commissioner to downplay an individual’s complaint; less easy to ignore one from a body such as Which? or the National Consumer Council.

Lastly, the draft regulation includes new powers for supervisory authorities, including the power to fine enterprises, in the worst cases, up to 2% of their annual worldwide turnover. That ought to grab the attention of companies like Google and Twitter.
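For a sense of scale, the 2% cap can be illustrated with a few lines of arithmetic. This is a sketch only – the turnover figure below is hypothetical, not a statement of any company’s actual accounts:

```python
# Illustrative ceiling on a fine under the draft regulation:
# up to 2% of an enterprise's annual worldwide turnover.

def max_fine(annual_worldwide_turnover: float, rate: float = 0.02) -> float:
    """Return the maximum fine at the given proportion of turnover."""
    return annual_worldwide_turnover * rate

# Hypothetical enterprise turning over 30 billion a year (any currency):
print(f"{max_fine(30_000_000_000):,.0f}")  # prints 600,000,000
```

Even at the draft stage, a nine-figure exposure dwarfs the current £500,000 ceiling on the Information Commissioner’s monetary penalties.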

The Browns’ damage or distress

Paper files of medical records


What would you do if you were approached by a newspaper that wished to publish an article about your child’s illness?  Assuming you do not have the resources to instruct lawyers specialising in privacy and data protection to consider obtaining an injunction, you could look at a little-known and rarely-exercised right in the Data Protection Act 1998.

Section 10(1) & (2) of the Data Protection Act 1998 states:

10  Right to prevent processing likely to cause damage or distress.

(1) Subject to subsection (2), an individual is entitled at any time by notice in writing to a data controller to require the data controller at the end of such period as is reasonable in the circumstances to cease, or not to begin, processing, or processing for a specified purpose or in a specified manner, any personal data in respect of which he is the data subject, on the ground that, for specified reasons—

(a) the processing of those data or their processing for that purpose or in that manner is causing or is likely to cause substantial damage or substantial distress to him or to another, and

(b) that damage or distress is or would be unwarranted.

(2) Subsection (1) does not apply—

(a) in a case where any of the conditions in paragraphs 1 to 4 of Schedule 2 is met, or

(b) in such other cases as may be prescribed by the Secretary of State by order.

In the scenario being dealt with here, none of the conditions in subsection (2) applies. As this right is rarely exercised, still less made the subject of court proceedings, there is no judicial interpretation of what is required to meet the “substantial” threshold, or of where the line is drawn between warranted and “unwarranted”, for section 10. However, issuing a section 10 notice is a cost-free approach. As this is a fundamental right under the Act, any recipient data controller ignoring a notice risks court action or, more likely, enforcement action by the Information Commissioner following a complaint by the person who issued it.

Although the Information Commissioner’s guidance on when he would be minded to issue a monetary penalty is not completely clear on this point, it is at least arguable that any denial of a section 10 right would be a severe breach of the Data Protection Act. As a severe breach, it could be the subject of a monetary penalty notice, which can include a fine of up to £500,000. The risk of a £500,000 fine, as well as the reputational fallout for a newspaper, might be enough to make a publisher think twice.

There is also the question of the lawfulness of the newspaper publishing the story concerning an individual’s medical condition.  In short, the publication is not covered by any of the lawful purposes for which medical data (included in the definition of “sensitive personal data” in the Act) may be processed. The only conceivable lawful purpose is contained in a statutory instrument, the Data Protection (Processing of Sensitive Personal Data) Order 2000. In particular, paragraph 3 of the Schedule to the Order states:

3.  The disclosure of personal data –

(a) is in the substantial public interest;

(b) is in connection with –

(i) the commission by any person of any unlawful act (whether alleged or established),

(ii) dishonesty, malpractice, or other seriously improper conduct by, or the unfitness or incompetence of, any person (whether alleged or established), or

(iii) mismanagement in the administration of, or failures in services provided by, any body or association (whether alleged or established);

(c) is for the special purposes as defined in section 3 of the Act; and

(d) is made with a view to the publication of those data by any person and the data controller reasonably believes that such publication would be in the public interest.

It is difficult to make a convincing case that knowledge of a child’s medical condition is in the substantial public interest for paragraph 3(a). Only the case of Leo Blair and MMR comes to mind as a possible example. Even then, the other conditions in paragraph 3 would remain unmet, so this could not be a lawful purpose.

However, newspapers can seek to apply the exemption at section 32 of the Act for journalism, literature or art. The newspaper would have to be clear that publication was in the public interest (section 32(3)) and within the scope of the Press Complaints Commission’s Editors’ Code (a designated code for the purposes of section 32 under the Data Protection (Designated Codes of Practice) Order 2000). It is an anomaly that the sensitive personal data Order described above imposes a “substantial public interest” test in connection with journalism (the “special purposes” in paragraph 3(c)), whereas section 32 does not. Note paragraph 6(v) of the current edition of the PCC Editors’ Code, and point 5 of the note on the public interest test to be applied in matters concerning children:

v)  Editors must not use the fame, notoriety or position of a parent or guardian as sole justification for publishing details of a child’s private life.

5.  In cases involving children under 16, editors must demonstrate an exceptional public interest to over-ride the normally paramount interest of the child.

Clearly, the section 32 exemption must be the one relied upon by News International in connection with the publication of Fraser Brown’s medical condition. It is disappointing, but perhaps not surprising given the relationship between No 10 and News International in 2006, that no complaint was made about the Fraser Brown report that would have given the Information Commissioner’s Office or a court the chance to define the limits of section 32, or to resolve the conflicting public interest tests in section 32 and the sensitive personal data Order.

If you consider that section 32 gives newspapers too much leeway, then note that the exemption does not cover section 13 of the Act. In particular, section 13(2)(b) provides, in effect, that “an individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if… the contravention relates to the processing of personal data for the [purposes of journalism]”. If, therefore, the Information Commissioner (following a complaint) or a court ruled that the newspaper had not published the (sensitive) personal data in the public interest, the individual concerned could sue the newspaper for distress. This would be in addition to any monetary penalty imposed by the Information Commissioner for the contravention.

To date only Naomi Campbell has obtained such distress damages (Campbell v Mirror Group Newspapers [2002] EWHC 499 (QB), subsequently upheld by the House of Lords [2004] UKHL 22). Although not clearly identified as such, it would seem that these damages amounted to a modest £1,000, out of a total award of £3,500 damages under section 13 of the Act and for breach of confidentiality. The low level of these damages has itself probably deterred section 13 actions against newspapers.

Is 17p per unsecured online file a fair monetary penalty?

Scales of Justice © Alex Proimos

On 10 May 2011 the Information Commissioner imposed a £1,000 monetary penalty on Andrew Crossley, trading as ACS Law, for a serious breach of security that permitted over 6,000 individuals’ details to be accessible on an unsecured website. Already, there is much discussion on the internet as to the fairness of this penalty. Has justice been done?

Internet comment is unlikely to be objective in the case of ACS Law, given that the firm was targeted by hackers with a distributed denial of service (DDoS) attack in retribution for its perceived aggressive approach to internet users alleged to be illegal copyright infringers (see the Wikipedia entry for ACS:Law). Andrew Crossley was reported to have made a profitable business of pursuing copyright infringement cases – claiming he would be buying a Ferrari F430 Spider for cash. The monetary penalty of £1,000 amounts to less than 17p for each individual whose details ACS Law left unsecured on its website as part of its recovery from the DDoS attack.
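The 17p figure is simple division, sketched below using the penalty and record count reported above (treating “over 6,000 individuals” as exactly 6,000, so the true per-record figure is lower still):

```python
# The £1,000 penalty spread across the 6,000+ individuals whose
# details were left accessible on ACS Law's unsecured website.

penalty_pence = 1_000 * 100   # £1,000 expressed in pence
records_exposed = 6_000       # "over 6,000 individuals"

pence_per_record = penalty_pence / records_exposed
print(f"{pence_per_record:.1f}p per record")  # prints 16.7p per record
```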

However, the Information Commissioner has made it clear that had ACS Law still been trading, he would have imposed a monetary penalty of £200,000 (the maximum that could have been imposed was £500,000). Clearly the Information Commissioner was satisfied by Andrew Crossley’s written representations, sworn on oath, and reduced the fine to a token £1,000 – much to the chagrin of many on the internet less willing to accept Crossley’s pleas of reduced financial circumstances.

Other data controllers ought to reflect on the factors considered by the Information Commissioner in setting the monetary penalty. In particular, the lack of investment in appropriate security measures was a major factor, as was the lack of appropriately IT-trained personnel in the organisation. In addition, whilst spending serious money to remedy the security breach (£20,000 in ACS Law’s case) was considered a mitigating factor, it was obviously not that significant given the level of the final penalty.

Lawyers and law firms also ought to take particular note that as far as the Information Commissioner is concerned, they cannot expect any leniency for any breach by them of the Data Protection Act 1998 – “Data controller is a lawyer and should have been fully aware of his obligations under the Act.”

School or Home Tuition Sales Agency?


An article in yesterday’s EducationGuardian caught my eye, and not because of the parental anger at schools’ pressure-selling of private tuition. What concerns me is the blatant direct marketing of a third party’s services by a school.

The story, if you did not click on the link, concerned parents’ anger at receiving letters signed by school headteachers on school headed notepaper marketing a DVD home tuition scheme of the Student Support Centre, a trading name of The Student Support Centre (UK) Limited (who, incidentally, fail to give their company name anywhere on their website, as far as I can see).  EducationGuardian discovered that the Student Support Centre pays schools an administrative fee for this marketing, but the article is unclear about what this fee is.  Anthony Lee, founder and chairman of the Student Support Centre, is quoted as saying he makes a “small token payment of up to £160”.

From a data protection point of view, the most obvious question is: do the schools that cooperate with the Student Support Centre or other home tuition companies include direct marketing as a purpose in their data protection notices? Personal data, which in these circumstances must include the names and addresses of pupils’ parents, must be processed in a fair and lawful manner. To be fair and lawful, the data controller must give the individuals concerned its identity and the purposes for which the data will be processed (see paragraph 1 of Part I – the First Data Protection Principle – and paragraph 1 of Part II of Schedule 1 to the Data Protection Act 1998). Also, personal data cannot be processed in any manner incompatible with any stated (and notified) purpose or purposes (paragraph 2 of Part I, the Second Data Protection Principle).

There is no data protection fair processing notice or privacy policy for the primary school discussed in the story, Towerbank primary, or Edinburgh City Council, available on their websites, so I cannot comment on the school in question.  However, I would not expect direct marketing to be a usual purpose notified to parents.  As an example of a model schools’ data protection fair processing notice, I am pleased to see that my local council, Hampshire County Council, has an excellent precedent (see here).

Schools may wish to consider whether receipt of a small token payment is enough of an incentive to breach the Data Protection Act 1998, for which monetary penalties of up to £500,000 can be imposed by the Information Commissioner for serious breaches.