In an attempt to weed out bad actors before any improper actions are taken, JPMorgan Chase & Co. (“JPMorgan”) is implementing a computer surveillance program to identify potential problem employees. The inputs to JPMorgan’s algorithm flag, among other things, workers who miss compliance classes, employees who violate personal trading rules, and workers who breach market-risk limits. The program monitors telephone transcripts and e-mails for key terms believed to be problematic or indicative of future bad acts. Thousands of asset-management and investment-bank employees will be subject to JPMorgan’s new surveillance program.
An employer’s right to review emails or texts sent on work devices is not, in and of itself, a new concept. Company policies and procedures may outline this right, or the employee may even have signed a formal acknowledgement of the employer’s right to search his or her emails. For example, the Supreme Court held in City of Ontario, Cal. v. Quon that a city’s search of an employee’s private text messages on his city-issued pager did not violate the Fourth Amendment because the search was “motivated by a legitimate work-related purpose” and “was not excessive in scope.” 560 U.S. 746, 764 (2010). For private employers like JPMorgan, even these restrictions would not apply, because the Fourth Amendment constrains only government actors. Employees therefore already have limited privacy in any emails or texts sent on work devices unless their employer provides otherwise.
JPMorgan’s algorithm approach, however, goes one step further by attempting to weed out bad actors before they actually commit any bad acts. Employees may not have committed any offense, but, per the algorithm, appear to be headed down a path of collusion or improper activities. JPMorgan sees this program as “predictive monitoring” – a proactive step to aid in detecting intentions and preventing bad acts. What remains unclear, however, is what happens when JPMorgan’s algorithm actually identifies a potential problem employee. Is that employee given a warning? Monitored more closely than his or her peers? Terminated? Particularly troubling is what happens when one of these employees is in a protected class or there is an allegation of targeting a particular group.
While this program and any resulting penalties or terminations may create exposure, JPMorgan is no stranger to legal fees in other areas. The price of defending any lawsuits resulting from this program may seem a necessary exchange compared with the tens of billions of dollars JPMorgan has paid its attorneys since the financial crisis began.
Finally, in addition to actually predicting behavior patterns and identifying problem employees, the very existence of this surveillance program may act as a deterrent to bad behavior. In Discipline and Punish: The Birth of the Prison, Michel Foucault describes Jeremy Bentham’s prison model, the Panopticon, in which prisoners would be coerced into good behavior by the knowledge that they were being (or could be) constantly watched by the prison guards. The constant “gaze” of the Panopticon was intended to induce in prisoners “a state of conscious and permanent visibility that assures the automatic functioning of power.” While only time will tell, the constant (or potentially constant) gaze of JPMorgan’s surveillance program may serve to check bad behavior before it begins.