I used ChatGPT 4.5 to fact-check the claims that judicial bias correlates with factors such as time of day, weather, and sports team performance. What I learned is that, at best, the assertions you make come from small studies, and/or studies with questionable methods, and/or there is a great deal of research that does not support them.
I expect bias in human judges does exist; the stated examples just don't demonstrate it clearly.
https://chatgpt.com/share/67ecb99d-f3fc-8011-9548-5d0d7f626b0d
AI worries me because it removes vital elements of being human. If generative AI can write everything I ever need, why would I make any effort to do it myself? If generative AI answers every question (badly, might I add - or through plagiarism, such a good baseline for a research engine), why bother looking for information myself? The problem is that the human brain needs interaction to learn. Getting AI to write you CliffsNotes (or do your work for you) reduces your ability to use any of that information later.
You say that humans are fallible - I won't deny it, but you can track those insecurities, biases, and mistakes without too much trouble. We've been doing it for millennia. I'm just not willing to lose brain capacity so a program can make my life "easier". YMMV.