RonPMexico t1_iu2cfec wrote

I'm saying that when you artificially favor one race over another in an otherwise race-neutral algorithm to get your desired results, it's a bad thing. You believe race should factor into everything, and you have the temerity to claim the moral high ground. Racism is bad, and you ought to be ashamed of your views.

0

RonPMexico t1_iu2axl8 wrote

So you are saying they can't be race-neutral, and you can't define when it's racist. Who gets to decide where to draw these arbitrary lines? How would they work with optimized systems? What is fair enough?

−1

RonPMexico t1_iu28ega wrote

I know. That's what we are discussing. You take the view that if an algorithm returns results that are not directly proportional to racial demographics, the system is racist. I'm saying that is ridiculous.

What doesn't convey meaning is:

> If companies ever backchecked their algos for mistakes or systematic bias, I might not be against it.

0

RonPMexico t1_iu25qoq wrote

Have you considered the opposite case? Using the real estate example: you have x number of variables, including salary, school district, visiting real estate websites, and so on. Each one of those variables is given a weight by the system. We don't know what those weights are; the system operates in a "black box" to determine the appropriate values. You look at the results and decide Native Americans are underrepresented. Now you have to add Native American as a variable, and in order to get the results you want, you have to decide how much that should impact the final results. So who decides to favor Native Americans, and by how much? Would that not be illegal under the Fair Housing Act?

1
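The scenario in that comment can be sketched as a toy weighted-scoring model. Everything here is hypothetical (the variable names, the weights, and the `adjustment_weight` value are made up for illustration); the point is only that shifting outcomes for one group means a human explicitly picking a number.

```python
# Toy sketch of a "black box" ad-targeting score (all names and
# weights are hypothetical, chosen only to illustrate the argument).

def score(prospect, weights):
    """Weighted sum over whatever variables the model tracks."""
    return sum(weights[k] * prospect.get(k, 0.0) for k in weights)

# Hidden, learned weights -- normally not visible to anyone auditing
# the results.
weights = {
    "salary": 0.5,
    "school_district": 0.3,
    "visited_realtor_sites": 0.2,
}

prospect = {
    "salary": 0.9,
    "school_district": 0.7,
    "visited_realtor_sites": 1.0,
}

base = score(prospect, weights)  # race-neutral score

# To change outcomes for one group, someone must add an explicit
# demographic variable and choose its weight -- the "arbitrary line"
# the comment is asking about.
adjustment_weight = 0.25  # who picks this number, and on what basis?
in_targeted_group = 1.0
adjusted = base + adjustment_weight * in_targeted_group
```

The sketch makes the asymmetry concrete: the original weights are inferred by the system, but `adjustment_weight` has to be set deliberately by a person, which is exactly the decision being questioned.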

RonPMexico t1_iu1xhnz wrote

The only way the algorithm would exclusively advertise to whites would be if that was an explicit direction given to the system. If you programmed the model to sell advertising at the highest price point, and the ads were sent to high-income earners in the school district who had searched for realtors (plus any number of other relevant variables), and the results were mostly whites, I'd have absolutely no problem with it.

1