We audited 65 real estate websites across 13 platforms… AI search is a problem.

Ryan Darani
Co-Founder at FlyDragon

Ryan runs FlyDragon's AI SEO operations, with over a decade of organic search experience under his belt.

I run FlyDragon’s AI SEO strategy and AI infrastructure.

We work with 100+ real estate agents across the US and Canada, with clients on something like 20 different website platforms. Every sales call opens on the same question: is my website AI-friendly? 

For six months I answered it anecdotally, but enough agents asked that I decided to run the audit properly: five live sites from each of the 13 most-deployed real estate platforms, 65 sites in total, scored against the 2026 standards for what makes a page citable by AI search engines.

This research ranks what I found, tells you what I think it means, and gives you the questions you can (and should) take to your vendor.

Why AI visibility is a different problem than traditional SEO

Traditional SEO was about ranking on page one of Google. Answer Engine Optimization (AEO) and Generative Engine Optimization (GEO) are about being the source an AI model cites when it writes, ranks, and prioritizes an answer on its own.

When a buyer in Nashville opens ChatGPT and types "who is the best real estate agent in East Nashville under $700k," the answer has nothing to do with yard signs or Zillow reviews. It depends on which agent's website an AI system can read, parse, and trust enough to recommend.

Being citable in AI means five things have to be true about the page (technical version):

  1. Core content is present in plain HTML. OpenAI's OAI-SearchBot and Anthropic's Claude-SearchBot don't render JavaScript the way Googlebot does. If your bio, service areas, and neighborhood guides aren't present in the HTML when the AI search bots hit your page, nothing is seen.
  2. Structured data is implemented in JSON-LD. RealEstateAgent, LocalBusiness, Person — these are the schema types that tell an AI system who you are rather than making it guess from unstructured prose (a minimal example follows this list).
  3. AI search crawlers are allowed at the robots.txt layer and the CDN layer. These are two different checks — and 27% of B2B sites fail the CDN one without knowing it (ziptie.dev via Mersel AI, February 2026). GPTBot and ClaudeBot are training crawlers and don't determine citation. OAI-SearchBot, Claude-SearchBot, and PerplexityBot do — plus the real-time user fetchers ChatGPT-User, Claude-User, and Perplexity-User. Most platform teams I've talked to don't know the distinction exists.
  4. Core Web Vitals are in the green. LCP under 2.5s, INP under 200ms, CLS under 0.1. Slow pages get deprioritized by AI Overviews the same way they get deprioritized by classic ranking.
  5. Entity signals are consistent. NAP data, mentions across the web, and disambiguation ("Sarah Chen, Douglas Elliman, Upper East Side") that let the AI resolve you to one specific professional.
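
To make item 2 concrete, here's a minimal sketch of what RealEstateAgent markup can look like in practice. All the details (name, brokerage, URL, phone) are invented placeholders borrowed from the disambiguation example in item 5; treat this as the shape of the markup, not a drop-in snippet.

```python
import json

# Hypothetical agent details -- placeholders only, echoing the example in item 5.
agent_schema = {
    "@context": "https://schema.org",
    "@type": "RealEstateAgent",
    "name": "Sarah Chen",
    "worksFor": {"@type": "Organization", "name": "Douglas Elliman"},
    "areaServed": "Upper East Side, New York, NY",
    "url": "https://www.example-agent-site.com",
    "telephone": "+1-212-555-0100",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "New York",
        "addressRegion": "NY",
    },
}

# What ends up in the page <head>: a JSON-LD script tag an AI crawler can parse
# without rendering any JavaScript.
print('<script type="application/ld+json">')
print(json.dumps(agent_schema, indent=2))
print("</script>")
```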

And now the plain-English version, so you can see what each of those means:

  1. Your content needs to be visible on the page to search robots. If the technology your website is built on isn't robot-friendly, your content won't be seen (and thus won't be ranked).
  2. Structured data is robot language: it tells a machine what the page is about in a structured way, instead of forcing it to read the headers and all the content (which burns tokens).
  3. Robots.txt is a file most real estate website providers include by default. However, the wrong instructions here can inadvertently block AI search engines from accessing your content (a quick way to check follows this list).
  4. Core Web Vitals measure how quickly and smoothly your website loads. Google introduced the metric some years ago so that performance would mean more than raw site speed.
  5. Entity = your brand: where it appears across the web, and whether the information about it is consistent.
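
A quick way to run the robots.txt check from item 3 yourself, sketched with Python's standard library (the domain is a placeholder; swap in your own):

```python
from urllib import robotparser

SITE = "https://www.example-agent-site.com"  # placeholder: use your own domain

rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# The user agents that actually determine AI citation, per the technical list above.
for bot in ("OAI-SearchBot", "Claude-SearchBot", "PerplexityBot",
            "ChatGPT-User", "Claude-User", "Perplexity-User"):
    print(f"{bot}: {'allowed' if rp.can_fetch(bot, SITE + '/') else 'BLOCKED'}")
```

This only tests the robots.txt layer. A clean result here says nothing about CDN-level blocking, which is the separate half of item 3.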

Most real estate platforms were built before any of this mattered. A few have caught up. Most are still selling IDX speed and CRM automation as if it's 2021.

How the team and I ran the website audits

I pulled five live agent sites from each of the 13 most-deployed platforms in the US market — 65 sites — and ran each through a headless crawler mimicking how an AI search system fetches a page. 

I checked the raw HTML that arrives pre-JavaScript, looked for JSON-LD schema in the document, tested robots.txt directives against named AI search user agents, and recorded the response code. 
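
Here's a minimal sketch of that per-site check. It assumes the third-party requests library, the URL and user-agent string are placeholders, and the real crawler did considerably more (rendering comparison, robots.txt parsing, timing); treat it as the shape of the audit, not the audit itself:

```python
import requests  # third-party: pip install requests

SITE = "https://www.example-agent-site.com"  # placeholder

resp = requests.get(
    SITE,
    headers={"User-Agent": "audit-crawler/1.0"},  # hypothetical audit user agent
    timeout=15,
)

raw_html = resp.text  # pre-JavaScript HTML: all a non-rendering AI bot ever sees
checks = {
    "response_code": resp.status_code,                    # 403 = unverified access
    "jsonld_present": "application/ld+json" in raw_html,  # JSON-LD schema in raw source
    "substantive_html": len(raw_html) > 20_000,           # crude proxy for server-rendered content
}
print(checks)
```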

Every site was scored on five criteria: 

  • HTML readability
  • Schema markup
  • AI search crawler access
  • JavaScript rendering risk
  • Composite readiness

Three notes on methodology before the scorecard. First, a 403 response isn't automatically AI-hostile: some CDNs return 403 to any request they don't recognize, and named AI search crawlers with verified user agents may pass through the same middleware. But a 403 against my crawler means, at minimum, unverified access for AI crawlers, and unverified is itself the problem.

Second, schema judgments came from raw page source, not platform marketing claims. Third, the Bold Trail finding is a robots.txt-level block: categorical, unambiguous, not up for interpretation.
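
If you want to probe the CDN layer yourself, one rough test is to fetch the same page under different user-agent strings and compare status codes. A sketch, assuming the requests library and a placeholder URL; note that passing this test still isn't proof of access, because many CDNs also verify crawlers by IP range, which you can't reproduce from your own machine:

```python
import requests  # third-party: pip install requests

URL = "https://www.example-agent-site.com"  # placeholder

user_agents = {
    "generic crawler": "audit-crawler/1.0",
    "browser-like": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "OAI-SearchBot (spoofed)": "OAI-SearchBot/1.0; +https://openai.com/searchbot",
}

for label, ua in user_agents.items():
    r = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
    # Diverging status codes mean the middleware filters on user agent.
    print(f"{label}: HTTP {r.status_code}")
```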

The best real estate agent websites ranked for AI search

Here's how the 13 platforms ranked, best to worst*.

| Platform | HTML Readable | Schema | AI Bot Access | JS Risk | Overall |
| --- | --- | --- | --- | --- | --- |
| AgentFire | Yes | Active | Open | Low | Best |
| Sierra Interactive | 403 blocked | Confirmed | Unverified | Medium | Mixed |
| Agent Image | Yes | WP-dependent | Open | Low | Mixed |
| Boomtown | Yes | Not visible | Open | Medium | Mixed |
| Brivity | Partial | Not visible | Open | Medium | Mixed |
| Luxury Presence | Partial | Not visible | Open | Med-High | Mixed |
| Curaytor | 403 blocked | Claims AEO | Unverified | Unknown | Mixed |
| CINC | Yes | Not visible | Open | Low | Mixed |
| Ylopo (site varies) | Depends | Depends | Depends | Varies | Mixed |
| Lofty | 403 blocked | Unknown | Unverified | Unknown | Poor |
| Market Leader | 403 blocked | Unknown | Unverified | Unknown | Poor |
| Real Geeks | 403 blocked | Unknown | Unverified | Unknown | Poor |
| Bold Trail | robots.txt block | Unknown | Blocked | N/A | Worst |

Eight of thirteen platforms are somewhere in the middle — readable but under-built for AI citation. Four are unverified on access and haven't publicly addressed it. One is actively opted out. And one is genuinely ahead of the pack.

Which is the best real estate website for AI search?

AgentFire — the only platform shipping features for AI

AgentFire has a structural advantage: raw HTML is served to crawlers without waiting on a client-side render, plugin ecosystems cover schema cleanly, and robots.txt is agent-configurable rather than vendor-locked. 

What tipped AgentFire to the top of the scorecard wasn't the WordPress-esque base. In March 2026 they shipped a schema update adding Article schema for blog content, Product schema for listing pages, and VideoObject for video widgets — explicitly framed as Google and AI platform readiness. Of the 13 platforms I audited, they're the only one publicly treating AI citation as a product feature rather than a side effect.

AgentFire is where I'd send a new agent tomorrow for AI visibility.

Sierra Interactive

Sierra's documentation confirms schema markup is applied automatically to listing pages, agent profiles, and contact pages. 

Their reputation for traditional SEO is earned — the Sierra sites I've worked with rank genuinely well on Google. And Sierra has a great team of people who genuinely care about the experience they give agents.

But every Sierra site I crawled returned 403, and the collateral-damage problem applies: their aggressive bot protection may be catching AI crawlers in a net built for scrapers. Sierra's support team is good.

If you're on Sierra, send them the question list below and get the answer in writing.

Agent Image

Agent Image also builds on WordPress, giving it the same structural access as AgentFire. 

The sites I reviewed loaded cleanly, with strong hyperlocal content organization (floor plans grouped by community, neighborhood landing pages) and raw HTML without JavaScript dependencies. 

Schema depends on the individual build, because Agent Image is a custom platform rather than a templated one. Of the middle-tier platforms, Agent Image has the most workable foundation — an agent on Agent Image who layers in schema and entity work will beat an agent on AgentFire who does nothing.

Boomtown

Boomtown sites loaded clean HTML over Fastly's CDN. Property search is heavily JavaScript-dependent, putting the bulk of the lead-gen surface behind a client-side render — meaning to an AI crawler, the listings effectively don't exist. No schema visible. 

Boomtown is a lead-gen platform first; AI citation isn't a documented product feature and the architecture reflects that.

Brivity

Brivity sites loaded core page content in raw HTML (good) but showed JavaScript-dependent navigation, authentication flows, and IDX integration that relied on client-side rendering (same problem as Boomtown).

From first-hand experience working with agents who use Brivity, I can tell you the platform itself hinders performance. We've deployed the same AI search approach for most of our 100+ agents, and the platform that gives us the greatest issues?

Brivity.

Luxury Presence

Luxury Presence prioritizes bespoke design and brand-forward aesthetics, and the agent sites on the platform are the prettiest in the sample. Sites loaded partial content in raw HTML, with a heavy JavaScript dependency on Cloudinary-loaded assets.

The bigger problem sits in property search, which runs on JavaScript frameworks AI search crawlers don't render. No schema on the listing pages I tested. 

The brand work is excellent. The AI citation readiness is not. They claim to offer ‘AI SEO’ as a service, but I strongly suspect it's an add-on rather than a core offering.

Curaytor

Curaytor's platform page says their sites are "SEO and AEO ready at launch" — the most forward-facing AI visibility language of any platform in this audit. 

Yet every Curaytor site I crawled returned 403, meaning my crawler couldn't access the site at all.

The bot-protection middleware that blocked me may or may not whitelist named AI search crawlers; I have no way to tell from the outside. 

I'd put the probability that Curaytor's middleware correctly allows OAI-SearchBot, Claude-SearchBot, and PerplexityBot at 50/50, and I'd want to see server logs before I moved an agent onto the platform on that claim alone. 

The intent is there. The proof isn't public. And, from what I’ve seen with traffic, most of their offerings do not support AI search.

CINC

CINC sites loaded cleanly with neighborhood guide content in plain HTML. Then I checked the page titles. "Home Page." Multiple sites. Multiple states. Basic on-page SEO fundamentals that should have been solved in 2014 aren't. No visible schema markup. CINC is lead-gen-first, and SEO and AI visibility are clearly not where product engineering attention has been spent.

Ylopo

Worth clarifying because agents shop for Ylopo as if it were a website builder. It isn't. Ylopo is an advertising, IDX search, and CRM overlay that sits on top of a Squarespace site — every Ylopo client I've ever audited has been on Squarespace underneath. Ylopo's AI visibility ceiling is whatever Squarespace delivers (readable HTML, no native real estate schema, no llms.txt, limited structured data). Ylopo can't compensate for the underlying platform's gaps because it isn't touching that layer.

Lofty

Lofty (formerly Chime) returned 403 across every site I tested. Whether this affects AI crawler access specifically is unverified, and the unverified status is the problem. If you're on Lofty, you have no way of knowing whether your site is reachable by OAI-SearchBot or Claude-SearchBot. Product investment has been in CRM and workflow automation; organic visibility has never been the headline.

Market Leader

Market Leader (CoStar-owned) returned 403 across the sample. It's built as a lead-generation machine (pay-per-lead pipes, turnkey conversion), and AI visibility isn't a documented feature. I'd put the probability of passing a competent AI-citation audit at under 30% without custom intervention.

Real Geeks

Real Geeks sites returned 403. Popular budget-friendly option with decent IDX integration and lead capture, but its SEO flexibility is well-documented as weaker than WordPress. AI visibility status unverified. If you're a new agent on a budget and Real Geeks is the cheapest IDX option, fine — but don't expect the platform to do AI visibility work for you.

Bold Trail

Bold Trail is the worst result in the audit and the gap is wide. Sites returned ROBOTS_DISALLOWED rather than 403 — a robots.txt directive instructing every crawler that reads robots.txt to stay away. OAI-SearchBot, Claude-SearchBot, PerplexityBot, Perplexity-User, ChatGPT-User, Claude-User — all of them honor robots.txt. 

An agent on Bold Trail has been opted out of AI visibility at the infrastructure level, by a policy they didn't write and probably don't know exists. I'd put the probability at 95%+ that no Bold Trail agent reading this has ever been told.

Which makes it the biggest finding in the audit — a category-level exclusion rather than an execution gap.
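
For illustration, here's how categorical that kind of block is. The robots.txt content below is a reconstruction of the pattern, not a quote of Bold Trail's actual file:

```python
from urllib import robotparser

# Hypothetical wildcard block -- the pattern, not Bold Trail's literal robots.txt.
BLOCKING_RULES = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(BLOCKING_RULES.splitlines())

for bot in ("OAI-SearchBot", "Claude-SearchBot", "PerplexityBot",
            "ChatGPT-User", "Claude-User", "Perplexity-User"):
    print(f"{bot}: allowed={rp.can_fetch(bot, 'https://www.example-agent-site.com/')}")

# Every line prints allowed=False: one wildcard directive opts the whole site out
# for every compliant crawler, AI search bots included.
```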

Join BulletProof

Join BulletProof (joinbulletproof.com) is a Texas-based done-for-you marketing service, not a website platform. They sell "AI-ready schema, voice search optimization, and AEO" alongside a Google local SEO package. I'm including them here because several agents I've spoken to this year were weighing them against us specifically for AI visibility work.

The public record on BulletProof is unusually thin for a company selling $399–$1,800/month packages on 12-month contracts. No Trustpilot profile. No G2 profile. No Sitejabber profile. No BBB accreditation and no customer reviews on their BBB listing. No Reddit threads.

Their Facebook page shows "Not yet rated (1 Review)." The 5-star testimonials that do exist are self-hosted on their own domain or pulled from self-submitted ProvenExpert ratings.

Every case study I could verify was real, but every case study was selected by BulletProof and posted by BulletProof.

The one third-party analysis of their work is a detailed 2025 review by Robert Newman at InboundREM. Newman is a competitor, which I'll stipulate upfront.

He flags what he describes as domain-rating inflation via low-authority directory links — a tactic that looks good on Ahrefs or Moz dashboards until a Google core update rolls the scores back. He also flags service dependency and dated site foundations.

I can't independently verify his link-quality claim from the outside, and I haven't audited a BulletProof client site at the link level. His technical critique is consistent with what I'd expect from a package built around "50+ directory profile creations" as a headline deliverable.

The AI SEO layer is the part I can assess directly. Stripped of the terminology, BulletProof's package is Google Business Profile management, directory submissions, 15+ monthly templated posts, Local Service Ads guidance, and a CRM. Directory citations and GBP posts don't produce AI citations.

What produces AI citations is original content, entity-consistent structured data across the open web, and subject-matter authority that resolves to a specific person in a specific market. BulletProof's deliverables don't meaningfully generate any of those.

The "AI-ready schema" is real schema — it'll validate — but schema on a site with thin original content and high-volume low-authority backlinks isn't what AI search engines are citing from.

The local SEO work may well produce leads. GBP is still a real channel and LSAs still convert in some markets.

I wouldn't recommend them for AI visibility specifically, and the lack of any independent review record on a 12-month contract is a structural risk I'd flag to any agent considering the spend.

7 questions to ask your website provider

I've watched twelve agents in the last year have the AI visibility conversation with their platform's support team, and I'll save you forty minutes. The seven questions that matter — copy them, send them, get the answers in writing:

  • Does our robots.txt explicitly allow all AI search bots to access our website?
  • Does any bot-protection middleware (Cloudflare, Imperva, AWS WAF) whitelist named AI search user agents at the CDN layer?
  • Is RealEstateAgent schema applied to agent pages automatically, in JSON-LD, and does it validate in Google's Rich Results Test?
  • Is core page content (bio, service areas, neighborhood guides) present in raw HTML pre-JavaScript?
  • Is the property search on the main domain or a subdomain?
  • What are our current LCP, INP, and CLS numbers in Search Console? (You can also pull these yourself; see the sketch below.)

That's six.

The seventh is the one that matters most: who internally owns AI visibility on your platform, and can I talk to them?

If the answer is "that's not a role we have," you already have your answer about the platform. If the answer names a person and they respond within a week, that's a real signal about where the product is heading.
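
And if you'd rather not wait on support for the Core Web Vitals answer, you can pull the field numbers yourself from Google's Chrome UX Report (CrUX) API. A standard-library sketch; the API key and origin are placeholders, and a metric may simply be absent if the site doesn't have enough traffic to appear in CrUX:

```python
import json
import urllib.request

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder: create one in the Google Cloud console
ORIGIN = "https://www.example-agent-site.com"  # placeholder: your site

endpoint = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"
body = json.dumps({"origin": ORIGIN, "formFactor": "PHONE"}).encode()
req = urllib.request.Request(endpoint, data=body,
                             headers={"Content-Type": "application/json"})

with urllib.request.urlopen(req) as resp:
    metrics = json.load(resp)["record"]["metrics"]

# p75 is the number Google evaluates against the green thresholds.
for name, target in [("largest_contentful_paint", "under 2500 ms"),
                     ("interaction_to_next_paint", "under 200 ms"),
                     ("cumulative_layout_shift", "under 0.1")]:
    if name in metrics:
        print(f"{name}: p75 = {metrics[name]['percentiles']['p75']} (target {target})")
```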

If you're unsure, book a call with us. There's no obligation. We'll tell you what's going on.

llms.txt is NOT important for AI visibility

One quick detour, because every AI-visibility piece aimed at real estate agents in 2026 is recommending llms.txt.

I don't.

Google's John Mueller and Gary Illyes have both publicly said no Google AI system consults llms.txt.

Multiple large-domain analyses have found zero statistically significant correlation between llms.txt presence and AI citation frequency. 784+ sites have implemented it, and server logs from the ones publishing their data show AI crawlers aren't fetching the file at meaningful rates. The proposal might become a standard in 2027 or 2028. It isn't one now. If your platform ships llms.txt support, fine — take it.

If you're paying a consultant to implement llms.txt, you're paying for cargo-cult SEO.

Your website platform is the floor

The platform is the foundation of AI visibility, nothing more.

You still have to build the building on top of it.

Even AgentFire has gaps. And a Nashville agent on CINC who does the right schema, the right entity-building, the right content cadence will outrank a Nashville agent on AgentFire who does none of that.

Reach is the first hurdle. Citation is the second, and the second is harder. AI citations favor sources that are authoritative (mentioned and linked across the web, not only on their own domain), entity-rich (clearly identified as a specific person in a specific market with specific expertise), fresh (updated with market-relevant content), and structured (schema and clean HTML so machines parse without guessing).

This is the work I do.

FlyDragon layers AI visibility infrastructure on top of whatever platform an agent is on — building the content, the entity signals, the citation trail that gets you named by ChatGPT and Perplexity when a buyer in your market asks. After 100+ agent partnerships across 20 platforms I can tell you the pattern clearly: the platform is one variable of maybe twelve.

The 403 blocks and the Bold Trail finding represent a systemic failure across an industry that hasn't caught up to how search works in 2026. The agents who win the next five years are the ones who figure out that being findable by humans and being citable by AI are two different problems with two different solutions — and who stop confusing the first for the second.

Most of the industry is still doing the first and calling it the second.

*This review is not personal to, or biased against, any of the real estate website providers mentioned. The information is accurate as of 20 April 2026, based on the live audits we conducted of each platform. These opinions were not swayed or influenced by any vendor.