One of the most warped narratives in the debate over artificial intelligence policy is the notion that algorithmic technologies will somehow exacerbate market discrimination or competitiveness problems. It is twisted logic because, in many cases, the opposite is likely true: AI technology gives us the best chance in years to correct human biases and improve marketplace competition and consumer outcomes.

Consider housing rentals. AI technologies can help create products and services that give people and businesses more market options. Some government officials have a different take, arguing that algorithmic applications must be preemptively regulated or banned due to concerns about theoretical discrimination or competitiveness harms.

For example, the Justice Department and a handful of state attorneys general recently filed an antitrust lawsuit against RealPage, a provider of property management software, claiming that the firm facilitated collusion among landlords to inflate rents. RealPage’s AI-enabled software helps property owners generate pricing recommendations for rental units. A bill has also been introduced in Congress that would prohibit the use of algorithmic systems to set prices for leased or rented residential dwellings. Finally, San Francisco city officials recently passed an ordinance banning the use of “algorithmic devices” to recommend rents.

Policymakers are responding to rental prices, which have climbed rapidly in recent years. Analysts report that rent inflation is 80 percent higher during the Biden administration than during the previous administration. This is a serious burden for the nearly one-third of Americans who rent and devote half of their income to this monthly bill.

Blaming AI systems for rent inflation is misplaced. What these policymakers are really at war with are the laws of economics. Like every other sector, housing and rental markets are subject to supply and demand, which ultimately determine how much the public pays for where they live. In this case, housing economics are severely distorted by decades of direct and indirect public policy interventions, often crafted with the best intentions.

Blaming landlords or AI software companies for the nation’s high housing costs is classic political deflection. Policymakers are avoiding the root causes of a complex problem, exacerbated by an unwillingness to accept that housing affordability is shaped most of all by the many barriers to building new dwellings in many communities.

Zoning regulations, property taxes, tariffs on construction materials, environmental restrictions and other policies combine to limit supply and raise prices. Experts in the field note a “huge empirical literature that documents a positive correlation between regulation and average or median house values.”

Broader economic issues affect housing and rentals, including interest rates and labor and migration policies. Meanwhile, the COVID crisis and the subsequent rise of remote work also had a jarring effect on housing and rental markets nationwide.

In other words, AI-enabled software is not the primary problem. Using antitrust or other regulations to limit technological capabilities could also undermine innovations that might help solve housing problems by opening up new business models and pricing plans.

The Justice Department has alleged that AI-enabled pricing tools inherently create more significant harm because “this new frontier poses an even greater anticompetitive threat than the last.” But nothing of the sort has been proven, and, by essentially demonizing new AI capabilities, the agency threatens to hold back innovations that could have many benefits, including letting us better determine if consumer harms actually develop.

The government itself relies on algorithmic pricing tools when it applies dynamic pricing on toll roads and certain utilities. Moreover, dynamic pricing is part of many other sectors of our economy, including airlines, hotels and ride-sharing, among others.

The fact that an algorithmic tool is used to adjust pricing in these cases is irrelevant. In reality, those technological capabilities help expand services for the public and should be welcomed, not banned.

Policymakers should focus on proving actual harms as they find them, not trying to shoot the technological middleman who provides an essential service with real benefits.