Prepare for an artificial intelligence (AI) takeover.
California was right to eliminate its cash bail system Aug. 28. However, replacing that policy with an e-carceration plan is not a solution. The new system, which uses a computer algorithm to decide whether an individual should be detained before trial, fails to reduce incarceration rates or protect against racial prejudice as it promises. Based on incarceration records in multiple states, the American Civil Liberties Union found that police reports, which the algorithms use to compute their decisions, are a main source of racial prejudice in the cash bail system. Specifically, such an algorithm takes past arrests into consideration even when there was never an actual conviction. Given that arrest rates are disproportionately higher in communities of color, the algorithm magnifies the very bias its creators seek to avoid. Yet people continue to support this system, which speaks volumes about the absurdity of their growing dependence on technology.
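To see how an arrest-based input can carry that bias straight into the output, consider a minimal sketch of a made-up risk score. The weights, names and numbers below are purely illustrative assumptions, not the actual tool California adopted:

```python
# Hypothetical illustration: a toy "risk score" that counts past arrests.
# The feature weights and example data are invented for illustration only;
# they do not reflect any real pretrial risk-assessment tool.

def toy_risk_score(past_arrests: int, past_convictions: int) -> float:
    """Return a higher score (more 'risky') for more arrests and convictions."""
    # Because arrests count even without a conviction, policing patterns
    # feed directly into the score.
    return 0.7 * past_arrests + 0.3 * past_convictions

# Two hypothetical defendants with identical conviction histories.
# The one from a more heavily policed neighborhood has more arrests on record.
lightly_policed = toy_risk_score(past_arrests=1, past_convictions=1)
heavily_policed = toy_risk_score(past_arrests=4, past_convictions=1)

print(lightly_policed)  # 1.0
print(heavily_policed)  # 3.1 -> flagged as higher risk despite the same convictions
```

In this toy example, the second defendant looks three times riskier solely because of where the arrests happened, which is the kind of feedback loop critics of the policy describe.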
Yes, computers can process extensive data more efficiently and accurately, but if that data is already compromised by racist opinions, the risk-assessment algorithm can be just as biased as a human judge, or even more so. California gives the risk-assessment policy too much credit. Rather than acting as an aid, the decisions made by the risk-assessment tool may override those of the presiding judge. Such a policy is illogical because it does not account for any flaws the algorithm may have.
Those awaiting trial are greeted by a computer algorithm. Those released on parole fare little better with electronic monitoring shackles than with traditional parole protocol. Electronic monitors, which take the form of ankle or wrist bracelets, have been used by law enforcement for over three decades. According to The Pew Charitable Trusts in 2015, usage of these monitors increased by 140 percent over the preceding decade. Far from humane, most of these devices rely on GPS tracking and other forms of constant surveillance to limit the rights of the convicted. Electronic monitors are simply another form of prison.
The risk-assessment algorithm and electronic monitoring devices are not the only forms of technology taking over. AI is being integrated into almost everything: search engines, virtual assistants, self-driving cars, medical diagnosis and more. In fact, Stanford University found that the number of AI startups has increased 14-fold since 2000. Of course, innovation is necessary, and efforts to improve technology should not be halted for fear of an impending robot revolution. New technology is not frightening, nor is it the issue; the problem is the growing number of people who depend on it.
People are putting too much faith in AI. In a study by Harvard University, over two-thirds of respondents said they would allow AI to control some aspect of their lives. However, according to Forbes, over 50 percent of people do not even realize that they interact with AI on a daily basis. The other half, those who are aware, are drawn to AI because it is convenient and relieves them of responsibility by making decisions for them. Although there is no stopping the growth of the AI industry, people simply need to be conscious of the technology they surround themselves with and place their trust in.
The possibility of AI surpassing human intelligence is also an issue. While that is unlikely to happen anytime soon, the prospect is a reminder to treat AI with proper caution. AI can have unforeseen consequences that cannot be fixed with a simple update, especially when it is applied to a large population. We still do not fully understand the capabilities of AI, and we should not hastily and irrationally depend on it.
As AI has grown more popular, people have been so taken in by its promise of removing bias and bringing convenience that they are willing to let technology rule their lives and decisions. AI should be designed to supplement human agency, not replace it. People are locking up not only criminals but also themselves in the cage of technology.