Housing AI boom collides with retreat of anti-bias safeguards
AI increasingly shapes housing outcomes such as loan approvals and lease decisions, but the Trump administration is rolling back long-standing disparate-impact protections: HUD and the CFPB have proposed rules that limit regulators' ability to challenge biased algorithmic decisions. Supporters say looser oversight prevents regulatory overreach and lets data-driven tools help lenders; critics warn it could normalize discrimination, since AI models learn from historically biased data. The rollback also leaves the rules open to reversal by future administrations, keeping alive the debate over how to police fairness in housing technology.