Not going to go back and find it to prove it, but I was asking it some engineering calculations a while back and it provided some factually inaccurate mathematical results. I understand it may not be "programmed" adequately for doing mathematical equations, but it did represent its answer as factual, so I don't think I had a bias on a mathematical answer. Point is that it represented its answer as correct. Theoretically, if I took its answer as factual and built a bridge, and it fell and killed people, who is culpable: me, the programmer, or the machine?

Have you considered the possibility that it gave you the correct answer, but that your own biases prevented you from seeing it as such?
I foresee complex, time-consuming calculations of all types certainly being within the future purview of AI.