
An algorithm is a step-by-step set of instructions for solving a problem, which programmers implement as code that a computer then executes within a program. Among the earliest recorded algorithms are the procedures of the third-century Chinese mathematician Liu Hui, who devised systematic methods for understanding the mathematical makeup of shapes and for solving problems such as systems of equations. As time progressed, other nations around the world, from the Middle East to Europe, developed their own algorithms to aid calculation in mathematics and science. Algorithms make people's lives easier by completing everyday tasks that would be too complex or too slow for the human brain to handle. For example, algorithms run inside the wiring and programming of traffic lights, air traffic control for aviation, heart monitors in hospitals, hospital file encryption, and crime risk assessment within the judicial system. Algorithms also control the process of approving mortgages, credit cards, and similar products, all fields where not only time but also accuracy is of the essence.

The use of crime algorithms within the judicial system came about with the implementation of the Second Chance Act of 2007. The SCA was a program approved by Congress that sought ways to improve the safety of communities across the country by reducing the chances that former criminals would repeat the same crimes or commit new ones. The SCA also paved the way for a more radical approach to rehabilitating former criminals. Since these offenders were re-entering society, it was important that they be able to rebuild meaningful lives. Many inmates, once released from prison, cannot obtain normal full-time employment from any company of their choosing because of their prior criminal record. Many also cannot get a college education, due to limited access to state and federally funded programs such as FAFSA (Free Application for Federal Student Aid), cannot obtain a driver's license if they did not already have one, and cannot qualify for car and home loans. The Second Chance Act made it possible, through different state and federally funded programs, to help inmates readjust to society's demands. Such programs include releasing prisoners early for good behavior, the Second Chance Act grant program that helps former offenders financially (including those released from federal prisons), The Up Center program that helps young fathers bond with their children, efforts to reduce youth recidivism, and many others.

Out of all the concerns about how these algorithms function, accuracy is, of course, the most important. In the judicial system, even before algorithms were introduced, offenders' rights were put at risk when case workers collected data through questionnaires in which offenders reveal sensitive information: family background, such as the number of parents or guardians in the home; past and present emotional stability; level of education completed; number of jobs held over a lifetime; income level; and medical and mental health issues. This data is collected and compiled into a massive database, which is then fed into an algorithm used as an assessment tool to predict how closely an offender fits the patterns drawn from the risk assessment questionnaires. Many judges, district attorneys, case workers, and lawyers are on the side of what is right and just, but there have been many instances of the judicial system failing offenders, often first-time or low-risk offenders, because factors tied to personal upbringing (ethnicity, race, neighborhood, family income, divorced parents, education level, employment history, and so on) are folded into the data, that data is compiled into crime algorithms, and the algorithms end up returning faulty crime risk scores. The result can be that a low-risk offender receives a high-risk score while a high-risk offender receives a low-risk score, with disastrous consequences.
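
To make the mechanics concrete, the following is a minimal, hypothetical Python sketch of how questionnaire answers might be reduced to a single number. The feature names and weights are invented for illustration; no real assessment tool publishes its formula in this form.

def risk_score(answers):
    """Combine questionnaire answers into one number (illustrative only)."""
    # Hypothetical weights; a real tool's weights are not public.
    weights = {
        "prior_convictions": 2.0,    # each prior conviction adds 2 points
        "currently_unemployed": 1.5,
        "unstable_housing": 1.0,
        "under_25": 1.5,
    }
    return sum(weights[k] * answers.get(k, 0) for k in weights)

# Two offenders with the same single prior conviction but different
# backgrounds end up with very different scores.
print(risk_score({"prior_convictions": 1}))                            # 2.0
print(risk_score({"prior_convictions": 1, "currently_unemployed": 1,
                  "unstable_housing": 1, "under_25": 1}))              # 6.0

As the second call shows, answers about background alone, not conduct, are enough to move the number, which is exactly where the accuracy concern comes from.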

For example, seventy-one-year-old Edward French, a professional in the film and photography industry, was gunned down by nineteen-year-old Lamonte Mims and twenty-year-old Fantasy Decuir. Mims robbed French of his camera, and it was ultimately Decuir who ended French's life with a gun. Mims was no stranger to the judicial system: he was already a felon who had served time for theft, and his profile had been run through a pretrial risk assessment tool meant to help judges determine whether offenders will commit new crimes, whether petty or serious. The judge who handled the case, San Francisco Superior Court Judge Sharon Reardon, reviewed the results of Mims's assessment and determined that he was not a threat, a decision that preceded the robbery in which French lost his life.

Algorithms are both respected and feared. For example, they are highly sought after in courtrooms and police vehicles around the country. John Arnold, a former Enron trader, along with his wife, created the Public Safety Assessment (PSA) tool in response to the growing bias within the judicial system that was sending innocent people to jail for crimes they did not commit. Worse, low-level crime committed by low-risk offenders was being classified as high-risk, leading to unjustly long jail times and outrageous fines, and most of the time a low-risk offender who cannot afford bail simply stays in jail. Arnold claims that the PSA can be used as an aid to help judges make more informed choices during sentencing and to lessen the threat of human bias interfering with a defendant's verdict; the tool "flags those defendants who present an elevated risk of committing a violent crime" (Arnold). The Arnolds claim that the data collected by the PSA is neutral and tracks the patterns of violent offenders who present an extremely elevated risk of re-offending if released from judicial custody. The factors that are weighed include the offender's age, pending charges and felonies, prior incarcerations, failures to appear at court dates, and so on. Each risk category is scored from one to six, with six representing the greatest risk, and courtrooms use those scores when dealing with offenders. The PSA software is not sold; with the Arnold Foundation's consent, it can only be donated to courtrooms by the foundation itself.
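
As a rough illustration of how such a one-to-six scale could work, here is a hypothetical Python sketch. The factors mirror those named above (age, pending charges, prior incarcerations, failures to appear), but the point values and cutoffs are invented and do not reflect the actual PSA formula.

def psa_style_scale(age, pending_charge, prior_incarcerations, failures_to_appear):
    """Map a few pretrial factors onto a one-to-six scale (illustrative only)."""
    points = 0
    if age < 23:
        points += 2                        # youth treated as a risk factor
    if pending_charge:
        points += 3                        # arrested while another case is open
    points += 2 * min(prior_incarcerations, 2)
    points += 2 * min(failures_to_appear, 2)
    # Collapse raw points onto the one-to-six scale, six being the greatest risk.
    return min(points // 2 + 1, 6)

print(psa_style_scale(age=19, pending_charge=True,
                      prior_incarcerations=1, failures_to_appear=0))   # 4
print(psa_style_scale(age=40, pending_charge=False,
                      prior_incarcerations=0, failures_to_appear=0))   # 1

A judge never sees this arithmetic; the court is handed only the final one-to-six number, which is part of why the tool's neutrality is so hard for outsiders to verify.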

Another, newer crime risk assessment tool is the Strategic Subject List (SSL), also known as the heat list, a list of offenders judged most likely to reoffend with a weapon such as a gun. It was created by the Illinois Institute of Technology in 2013. The SSL helps on-duty police officers patrolling communities in Illinois, mostly communities dealing with high rates of crime, monitor former offenders by assessing their chance of reoffending; officers can pull up a former offender's case file on the dashboard monitor in their vehicle. Inside the SSL program's algorithm, the risk score starts at one and can climb well past five hundred (the highest risk), depending on the number and types of convictions and other factors. If the threat level is high, the individual is more likely to be labeled a PTV, or party to violence, meaning someone expected to be involved in future shootings and murders. Currently, there are around four hundred thousand people on the SSL; most of them are young, and no one aged thirty or older has a risk level above four hundred. What is interesting about this program is the controversy surrounding it, which leads to the bigger question raised by such algorithms: just how much information is being collected, and why do offenders and the public (especially civil rights groups) not have access to information that can be used to help or ruin someone's life?
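
Because the actual SSL formula has never been released, the following Python sketch is purely speculative: it shows one way a "heat" score could accumulate from a case file and trip a party-to-violence flag. The point values, offense categories, the victimization factor, and the threshold of four hundred are all invented.

CONVICTION_POINTS = {"weapons": 60, "violent": 50, "narcotics": 20, "property": 10}

def heat_score(convictions, times_shot=0):
    """Accumulate an SSL-style score from a case file (illustrative only)."""
    score = 1                                  # scores start at one
    for offense in convictions:
        score += CONVICTION_POINTS.get(offense, 5)
    score += 40 * times_shot                   # hypothetical weight for being a shooting victim
    return score

def party_to_violence(score, threshold=400):
    """The flag an officer might see on a dashboard, not the raw arithmetic."""
    return score >= threshold

record = ["weapons", "violent", "violent", "narcotics"] * 3
score = heat_score(record, times_shot=2)
print(score, party_to_violence(score))         # 621 True: a long record crosses the line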

To date, Illinois police will not release any information about what is inside the SSL algorithm, stating that doing so would violate source-code and patent protections. The decision of the Illinois police not to disclose how the SSL algorithm works is the same kind of decision reached across the country, where there is more protection for source code and patent rights than for human rights. When it comes to these algorithms, there is no transparency. Civil rights groups are worried and furious that there is not more control over what kind of data is being collected, especially when algorithms can be used to deny people access to programs because of their struggling economic status, ethnicity, or race, or even cause some people to be unfairly jailed due to racist algorithmic bias. For example, The Washington Post found that even when two people commit identical crimes, racial biases embedded in the data and in other forms of criminal documentation can lead to a Black person receiving a heavier jail sentence than a white person with the same type and number of offenses on their record, because a Black person is more likely to have an arrest history (a consequence of historic economic and racial bias in the communities where they live or once lived). Because these algorithms are guided neither by compassion nor by reasoning, they only analyze data, and if that data is biased, then the results will be biased.

When discussing bias in algorithms, it is important to note that programmers themselves are not deliberately entering bias into them. Instead, these algorithms learn bias when they read data from databases that pull case file information from risk assessment programs, the internet, crime logs, and other analytic sources; if the data is not correct, then the assessment of individuals will not be correct, producing gross bias that can easily be carried into the judicial system. The case of twenty-year-old Fantasy Decuir and Lamonte Mims, whose flawed assessment preceded the murder of Edward French, is one such instance.

The Pacific Standard featured an article addressing the startling evidence that algorithms used within the judicial system can increase the arrest rate of Black people because of mathematical predictions that incorrectly label them as future criminals. One such algorithm, COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), created by a company called Northpointe, is used in several states around the country. COMPAS helps the judicial system decide which offenders are at the greatest risk of committing more crimes. The program draws on answers to more than one hundred questions, and surprisingly, none of the questions concern race. Yet racial bias still appears in its output, in recurring situations around the country. Dylan Fugett, for example, was arrested for drugs (cocaine and marijuana) and arrested multiple times after that first arrest, yet was given a low risk score of three because the data inside COMPAS was biased, whereas Bernard Parker, an African-American man, was given a high-risk score of ten due to racial, algorithmic bias. There is bias even in everyday software, from facial recognition systems that struggle to recognize anything other than white facial features to digital cameras whose light balancing adjusts more easily to lighter or paler skin than to brown or darker skin. Much of the trouble we have with algorithms is that they operate on data alone. Algorithms cannot comprehend the complexities of human nature, which consists not only of logic and reasoning but also of human connections infused with emotion; these factors, along with socio-economic advantages or disparities, make up an entire individual. Facebook currently has an algorithm that protects white men's hate speech, yet the same algorithm will remove a pro-Black post, such as one supporting the Black Lives Matter movement, and disable the poster's account for about a week. Equally startling, many people of color do not get employment callbacks because their names are perceived as Black-sounding.
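
To see how a score can skew along racial lines without race ever being an input, consider this hypothetical Python sketch. The weights and feature names are invented and are not COMPAS's; the only difference between the two defendants is arrest history and neighborhood arrest rate, both of which can reflect uneven policing rather than different conduct.

def toy_decile_score(current_charges, prior_arrests, neighborhood_arrest_rate):
    """A made-up one-to-ten risk decile; race is never an input."""
    raw = 1.0 * current_charges + 0.8 * prior_arrests + 4.0 * neighborhood_arrest_rate
    return min(int(raw) + 1, 10)

# Identical present-day conduct: one drug charge each.
score_a = toy_decile_score(current_charges=1, prior_arrests=1,
                           neighborhood_arrest_rate=0.2)   # lightly policed area -> 3
score_b = toy_decile_score(current_charges=1, prior_arrests=4,
                           neighborhood_arrest_rate=0.9)   # heavily policed area -> 8
print(score_a, score_b)

The arrest-history inputs act as proxies: because they already encode where police look hardest, the "race-blind" score reproduces the disparity anyway.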

What is interesting about algorithms and sex offenses is that the crime algorithm used within the state of Massachusetts is not applied to offenses unconnected to sexual violence; it is reserved for sex offenders and other sexual predators. However, the use of risk assessment programs in Massachusetts may change. As of November 2017, academics and officials from MIT, Harvard, and elsewhere were warning the legislature about the dangers of bias within algorithms. There is also a small program operating in Massachusetts, the Massachusetts Arrest Screening Tool for Law Enforcement (MASTLE), which attempts to determine whether certain at-risk youth are likely to recommit a crime.