Cyberterrorism - AI, Algorithms and Robotics - AI and Criminal justice

7 important questions on Cyberterrorism - AI, Algorithms and Robotics - AI and Criminal justice

Why would we apply AI in predictive policing, and what are the concerns?

  • Application: Utilized in the pre-procedural phase to predict potential crime locations.
  • Example: In the USA, AI is used to provide police officers with areas of the city where crimes are more likely to occur.
  • Concern: Predictive policing may reinforce existing societal biases.
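The idea behind place-based predictive policing can be sketched in a few lines. This is a minimal toy illustration with entirely invented data and a hypothetical `predict_hotspots` function, not any real system: it ranks city areas by historical incident counts and flags the top ones for patrols. It also hints at the bias concern above, since areas patrolled more heavily generate more recorded incidents, which then feed back into the same ranking.

```python
from collections import Counter

# Invented historical incident records: (area, month) pairs.
historical_incidents = [
    ("downtown", "2023-01"), ("downtown", "2023-02"),
    ("harbor", "2023-01"), ("downtown", "2023-03"),
    ("suburb_a", "2023-02"),
]

def predict_hotspots(incidents, top_n=2):
    """Rank areas by past incident counts and return the top_n."""
    counts = Counter(area for area, _ in incidents)
    return [area for area, _ in counts.most_common(top_n)]

hotspots = predict_hotspots(historical_incidents)
print(hotspots)  # the areas that would receive extra patrols
```

Note the feedback loop: if more patrols in `downtown` produce more recorded incidents there, the next run of this ranking will flag `downtown` even more strongly, regardless of the true underlying crime rate.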

What is the function of AI in the Procedural Phase?

  • Functions:
    • Producing evidence through facial recognition and reading brain activity.
    • The risk of tampering with, or fabricating, evidence.
    • Judicial decision-making or support (e.g., Loomis vs Wisconsin).
  • Discussion Point: Raises questions about the role of AI as a judge.

What are the two principles in AI-assisted Judging?

  • Principle of Legality: a judge's use of AI must be recognized by law; for example, if a judge wants to use ChatGPT, that use first has to be authorized by legislation.
  • Principle of Individualization of Punishment: the trial and the punishment must still be individualized, with a human examining every case, because every case is different. Systems like COMPAS in the USA assign risk scores to individuals to estimate how dangerous they are, but the system can be wrong, which leads to mistakes.

Could AI discriminate or carry disinformation in its system?

Bias arises from two problems:
  • Discrimination present in the data: this data feeds the system, so the final decision will reflect the biases in the initial data.
  • Bias inside the AI system itself (the black-box problem).
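The first problem, discrimination present in the data, can be shown with a deliberately simple toy model. This sketch uses invented data and a hypothetical `fit_majority` function: a "model" that just memorizes the most frequent historical outcome per group will faithfully echo back whatever skew the training data contains.

```python
from collections import defaultdict, Counter

# Invented, deliberately skewed training data: (group, past_decision).
training = [
    ("group_a", "high_risk"), ("group_a", "high_risk"),
    ("group_a", "low_risk"),
    ("group_b", "low_risk"), ("group_b", "low_risk"),
    ("group_b", "high_risk"),
]

def fit_majority(data):
    """Learn the most common historical label for each group."""
    by_group = defaultdict(Counter)
    for group, label in data:
        by_group[group][label] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

model = fit_majority(training)
# The "prediction" for each group is just the historical skew echoed back.
```

Real systems are far more complex, but the principle is the same: if the initial data is discriminatory, a coherent model of that data reproduces the discrimination.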

Could AI lead to violations of personal data protection?

Systems such as ChatGPT are provided by private companies, so when we hand our personal data to those companies, we cannot be sure that the state where they operate applies the same privacy rules as our own country.

Does machine bias happen in real life?

One example of discrimination in the USA is the case of Loomis v. Wisconsin. COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), software used across the country to predict future criminals, was found to be biased against black defendants. Judges use such tools to assess an individual's risk of reoffending, and coercive measures or even the final sentence may be based on that assessment.
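One way the bias claim about risk tools is usually framed is through false-positive rates: among people who did not reoffend, what fraction was wrongly flagged high-risk in each group? The numbers below are invented for illustration only, not COMPAS's actual figures; the point is that the rates can differ sharply between groups even when the tool looks accurate overall.

```python
def false_positive_rate(flagged_no_reoffend, total_no_reoffend):
    """Share of non-reoffenders who were wrongly flagged as high-risk."""
    return flagged_no_reoffend / total_no_reoffend

# Hypothetical confusion counts for two groups (invented numbers).
fpr_group_a = false_positive_rate(45, 100)  # 45% wrongly flagged
fpr_group_b = false_positive_rate(23, 100)  # 23% wrongly flagged
```

If a judge relies on the score, members of `group_a` who would never reoffend are nearly twice as likely to be treated as dangerous, which is exactly the kind of disparity the Loomis debate is about.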

What is the Stochastic parrot concept?

We have to remember that these tools have no consciousness and are not aware of what they write. The machine only produces coherent text: the output can read as if it makes sense while failing to match reality or simply being untrue.
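The stochastic parrot idea can be made concrete with a tiny language model. This is a minimal sketch, using an invented two-sentence corpus and a hypothetical `parrot` function: a bigram model that picks each next word from words that followed the current one in training. The output is statistically plausible, but the model has no idea whether any sentence it emits is true.

```python
import random
from collections import defaultdict

# Tiny invented corpus; "." is treated as an ordinary token.
corpus = ("the court found the defendant guilty . "
          "the court found the evidence weak .").split()

# Bigram table: for each word, the words that followed it in training.
model = defaultdict(list)
for w1, w2 in zip(corpus, corpus[1:]):
    model[w1].append(w2)

def parrot(start, n=6, seed=0):
    """Emit n statistically plausible next words, with no grasp of truth."""
    random.seed(seed)
    words = [start]
    for _ in range(n):
        followers = model.get(words[-1])
        if not followers:
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(parrot("the"))
```

Every word transition it outputs occurred somewhere in the corpus, so the text sounds coherent, yet the model can freely recombine fragments into claims (e.g. about guilt) that no source ever made. That is the parrot: fluent form, no grounding.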
