AI ‘black box’ problem could expose agencies to legal risk

Estate and letting agencies could face mounting regulatory and reputational risks if they adopt artificial intelligence systems that make opaque, untraceable decisions, PropTech provider Reapit has warned.

The company says so-called ‘black box’ AI – tools whose outputs cannot be easily explained or audited – may breach new UK data-use and consumer-protection laws if deployed without transparency.
It is urging agencies to prioritise “explainable” and ethical AI models that show how decisions are reached.

AI now underpins a growing number of functions in the property sector, from automated valuations and listing generation to tenant screening and maintenance reporting.

OPERATIONAL EXPOSURE

Yet many of these systems, particularly those built on deep-learning frameworks, produce outputs that even their developers cannot fully explain. Analysts at IBM note that this opacity can create significant legal and operational exposure when decisions affect consumers.

Under the Data (Use and Access) Act 2025, agencies must disclose when AI is used in decision-making and allow customers to request a human review.

Similarly, the Digital Markets, Competition and Consumers Act 2024 prohibits misleading property descriptions or AI-generated imagery that could misrepresent a home.

A recent case involving AI-enhanced property photos prompted consumer groups to warn that buyers and tenants risk wasting time and money visiting inaccurately presented listings.

AI ADOPTION SPEEDING UP

Despite these concerns, AI adoption is accelerating. McKinsey forecasts that global AI use across real estate will grow by more than 40% by 2026, with PropTech investment expected to exceed €10 billion (£8.7 billion) a year.

Matt McGown, Chief Product Officer at Reapit, says agencies should weigh efficiency gains against compliance and trust risks.

He says: “The question isn’t whether agents will use AI, it’s whether they’ll use the right AI. Generic tools might save time, but they can also introduce risk.”

RISKING FINES

He adds: “If your AI can’t show how it reached a decision, or how much it edited a photo, what information it used to draft a property description, or why it approved a tenant or prospective buyer for a viewing, you’re risking fines and your hard-won reputation.”

Reapit says its forthcoming Reapit AI (RAI) platform will use each client’s own data to deliver transparent, auditable outputs with human oversight built in.

McGown says the goal is to help agencies “move faster without compromising trust,” combining automation with accountability to ensure compliance and maintain consumer confidence.
