Screening for Security, Accounting for Privacy

Address given to the Mackenzie Institute, “Security Matters” Conference

Toronto, Ontario
October 15, 2015

Address by Patricia Kosseim
Senior General Counsel and Director General, Legal Services, Policy, Research and Technology Analysis Branch

(Check against delivery)


Thank you to the Mackenzie Institute for inviting us back to your conference this year.  It is a wonderful opportunity to learn from you, renowned experts in the field of security, which helps make it “real” for us as data regulators.

Your conference this year is aptly entitled “Security Matters”.  Does security matter?  Of course it does.  Was I scared the morning of October 22, 2014?  Darn right I was.  Were it not for a photocopy glitch that caused a delay, one of our employees, on Parliament Hill that day to file our Annual Report, would have been in the very corridor at the very moment the shootings occurred.  My kids go to school only a stone’s throw from the Tomb of the Unknown Soldier, where Corporal Nathan Cirillo was fatally shot.  But I am not here as a mother, nor as an employer.  I am here in my role as representative of the Privacy Commissioner, and my main message to you today is simply this:  just as security matters, privacy does too, especially in a free and democratic society like Canada, where people want both — or at least a healthy balance between the two.

The title of this panel, more specifically, is “People Vulnerabilities” and for the next few minutes, I will be addressing potential People Vulnerabilities from Within, and the privacy implications of trying to screen them out.

To begin, it’s important to emphasize that just as you all have responsibilities regarding security and safety, you also have legal obligations with respect to privacy.

In the federal government context, which is governed by the Privacy Act, institutions are obligated to collect only personal information that is directly related to their operating program or activity, and use it only for the purposes for which it was collected.  That includes information collected as part of security screening activities. 

The policy regarding security screening, which is set by the Treasury Board Secretariat, has recently been amended to allow for much broader collection than had previously been the case.  We have been told that the driver behind this policy change is partly to standardize what departments are doing in this area, to ensure that a “secret” clearance with department A is the same as a “secret” clearance with department B.  This should reduce duplication and, potentially, save time and money.

However, there are elements of this policy which have presented problems, resulting in a court action which I’ll speak about in a moment.  Specifically, for the first time, all public servants applying for a security clearance — even the lowest level, the “Reliability Status” — will be subject to mandatory credit checks and law enforcement enquiries before they assume the duties of the position for which the clearance is required. 

The Standard also rolls out the collection of fingerprints broadly, rather than only in cases where name checks are inconclusive, and introduces much broader inquiries into applicants’ internet activities, including visits to websites, social networking sites, videos shared, and postings to wikis or blogs.

And finally, the new policy moves away from a “point in time” assessment conducted upon application or expiration, to a more fluid and ongoing process.  The policy’s new “aftercare” component involves ongoing monitoring of individuals who have already been granted a clearance to ensure their reliability is still valid, and places the onus on employees to report changes in their financial situation (such as bankruptcy or unexpected wealth); in certain cases, they may also be required to report changes in personal circumstances, such as marital status.

The Professional Institute of the Public Service of Canada, the largest public service union representing scientific and professional public servants affected by this new policy, alleges that these measures are unduly invasive and constitute a breach of their collective agreement.  They believe that while this kind of searching is appropriate for those who hold higher levels of clearance and routinely come into contact with sensitive information and assets (namely those holding “Secret” or “Top Secret” clearances), those applying for lower levels of clearance, the “Reliability Status” holders, need not be subjected to these additional measures.  In short, they believe the policy to be disproportionate and excessive.

Accordingly, earlier this year, PIPSC filed for an interlocutory injunction with the Federal Court, asking that implementation of this new policy be halted pending a judicial review.  One of the issues raised in the injunction was the allegation that certain security clearance measures violate section 4 of the Privacy Act, which says:  “No personal information shall be collected by a government institution unless it relates directly to an operating program or activity of the institution.”

The union asserts that all this additional information — web browsing history, “liked” posts or YouTube videos and so on — has nothing to do with the day-to-day workings of the department requesting the clearance.  The federal employer argued that security screening itself is an “activity” of the institution and as such, they should be permitted to collect whatever information they need to ensure they’ve got the right people in place.   

Late last month, the Federal Court refused to grant the interlocutory injunction.  It found that PIPSC had failed to demonstrate that the applicants would suffer irreparable harm if the policy were not immediately halted, and that, on a balance of convenience, taking into consideration both the applicants’ interests and the public interest, it would be better to continue the roll-out of the new standard until the court can hear the parties and decide the issues on their merits.  So even though the interlocutory injunction was not granted, whether the policy complies with the Privacy Act remains to be determined on judicial review.  Stay tuned!

As I mentioned, one of the reasons this policy is being proposed is to standardize screening practices.  I’d like to tell you about a situation where we felt a government department went above and beyond the usual screening procedures, and ventured into activities we considered unacceptable.

Several years ago, it came to our attention that the Canada Border Services Agency (CBSA) had implemented an enhanced personnel security screening measure through the creation of what they called the “High-Integrity Personnel Security Screening Standard” (HIPSSS).

This Standard, unique to CBSA, built upon the baseline set by the Treasury Board Secretariat but augmented the screening in several ways: querying RCMP and CSIS indices, for example, and requiring additional information on residency and travel activities.  However, one such augmentation gave our officials pause:  the deployment of a written Integrity Questionnaire for all candidates, which included highly intrusive questions about drug use, alcohol consumption, gambling, the hiring of prostitutes, sex tourism and the downloading of certain kinds of images.

This came to our Office’s attention through the Privacy Impact Assessment process, which requires institutions undertaking new activities involving personal information to assess the privacy risks involved.  We received the PIA on this process, and were troubled by the kinds of questions being asked in the Integrity Questionnaire.  While we acknowledged the importance of thorough screening practices, we found the questions strayed from necessity into invasiveness and were dangerously vague; moreover, CBSA was unable to make the case for asking them, particularly for job duties that did not require access to sensitive information or assets.  We were also unconvinced that this information, if collected, could be safeguarded appropriately — and the consequences of loss or theft of such intensely personal information would be grave indeed.  Furthermore, we noted that the results of these questionnaires would be stored in a database that lacked audit trail capacity, so there would be no way of knowing if someone had snooped into their contents without authorization.

After much consultation and discussion, CBSA agreed with our assessment, eliminated or modified questions unlikely to elicit information about an individual’s loyalty or security, and replaced the proposed questionnaire with Integrity Interviews, which are arguably less invasive.

Furthermore, we gave CBSA feedback on the development of training material for those conducting these interviews, to ensure that only necessary information is collected, that CBSA clearly explains to interview subjects the purpose of the collection, and that the repositories in which this information is stored are as safe as possible.  While we still have concerns about some aspects, we believe that CBSA benefited from our back-and-forth on this file, and that ultimately, we were able to strike a better balance between privacy and security.

And in a context such as this, it’s worth saying explicitly that this isn’t a “zero sum” game.  It is possible to meet the employer’s need to have the right people with the right clearances in place for the job to be done, while protecting the privacy of employees as required by law.

On that note, I’d like to describe to you another situation involving security screening which came to our attention from one of our oversight colleagues.  In their 2013-2014 annual report, the Security Intelligence Review Committee or SIRC — the body that oversees the activities of the Canadian Security Intelligence Service — reported on a review they conducted of CSIS’s security screening practices.  By way of background, for over thirty years, CSIS has been mandated to do two main things:  collect threat-related intelligence and conduct security screening activities.  The former is, of course, done without the knowledge or consent of those under surveillance; to do otherwise would compromise the very purpose of intelligence collection.  In contrast, CSIS’s security screening activities are done with full consent of the subject — if you want a clearance, or need to maintain one as a condition of employment, you agree to give up certain elements of your own personal information.  

From time to time, SIRC reviews CSIS’s security screening activities to ensure that they are being conducted in a manner that respects the law, policies and Ministerial Directives.  Their last review of these activities raised an issue which implicates our office.  Specifically — and let me quote from the SIRC report — they “identified a serious concern that changes to the internal use by CSIS of the information it collects for security screening purposes could be in contravention of the Privacy Act, or could leave room for abuse regarding the use of such information.”  Because the public report is unclassified, only so much detail can be provided.

Suffice it to say that the Privacy Act is clear:  you can’t collect personal information for one purpose and then use it for another, unrelated purpose.  SIRC recommended that CSIS consult with our office on changes affecting the internal use of information collected for security screening purposes; our discussions are ongoing.

Let me raise one last example, courtesy of today’s Globe & Mail article referring to a little-known “far-reaching program of financial surveillance” that was further expanded with the adoption of Bill C-31, last year’s omnibus bill that amended Canada’s anti-money laundering and terrorist financing legislation.  The Bill brought in a broader concept of “politically exposed persons” that now includes domestic senior public office holders, elected politicians and senior bureaucrats, as well as their families and close associates, both personal and professional, going back and forward twenty years.  This was intended to bring Canada into compliance with its international obligations set by the Financial Action Task Force (FATF).

Financial institutions are now required to monitor the financial transactions of these domestic politically exposed persons and report suspicious activity to FINTRAC, which, in turn, can share this information with the RCMP, the CRA, and now also the seventeen departments and agencies named in Bill C-51, if the information is considered relevant to their national security mandates.

A member of the Advisory Committee to the federal Finance department describes the regime in today’s paper as “a very wide net that catches very few fish”. 

Two of the fish caught in that net were Louise Arbour’s daughters, one of whom saw her bank account frozen when she refused to answer her bank’s questions.  For our visitors here from outside Canada who may wonder who Louise Arbour is:  she is a former Supreme Court of Canada Justice, a former international war crimes prosecutor, and a former United Nations High Commissioner for Human Rights.

Although the article bemoans the fact that this little-known provision was not discussed in Parliamentary Committee, our Office did raise issues about it at the time, in our appearance and in our submission, which you can find on our website.

So where does this leave us?  How can privacy risks be identified and mitigated in a manner that allows for appropriate security?  I spoke earlier of the Privacy Impact Assessment; our Office believes this to be a valuable tool for accomplishing this end.  You may be much more familiar with its close cousin, the “Threat Risk Assessment”, which is commonly used by experts like you to identify threats to systems, determine the level of risk those systems are exposed to, and recommend the appropriate level of protection.  The PIA works in a similar way, but identifies and assesses privacy risks throughout the development life cycle of a program or system.

At its core, a PIA is a mechanism for starting a conversation about whether an institution’s personal information practices comply with the Privacy Act:  what personal information is being collected by a program, for what purpose, how it is to be used, with whom it will be shared, how it will be safeguarded and how, ultimately, it is to be disposed of when its useful life is over.

This process is one of the ways you can ensure that your legal obligations with respect to privacy and the protection of personal information are met.  It is also a way of opening a constructive and very practical dialogue with our Office, through which we hope to arrive at a well-balanced formula that meets your security objectives while addressing your privacy obligations.

Thank you and I look forward to your questions.