How PropTech 2.0 Will Revolutionize Residential Real Estate Pricing

If you’re a homebuyer or seller, you’re familiar with aspects of property technology, or PropTech – for example, you may have used Zillow’s “Zestimate” to figure out whether your sale or offer price seemed reasonable. For housing-industry professionals, PropTech advancements thus far have offered varying degrees of useful and usable insight. Still, like any evolving technology, PropTech has stumbled, and its flaws became glaringly apparent during the pandemic.

I’m here to tell you not to lose faith in it – instead, sit tight as better artificial intelligence (AI)-driven models roll out new ways of pegging housing values that will benefit every person in every touchpoint throughout the residential real estate industry.

PropTech 1.0: Its promise and shortcomings

I’ve often written about how the most fundamental factor for buying and selling homes – the sales price – has long been based primarily on closing data for similar homes (comparables, or “comps”). I’ve mentioned that this information – although it was the best available – was problematic for several reasons, including that:

  1. Although residential real estate is the largest asset class in the US, valued at well over $40 trillion, the data that’s typically used to price it is latent by 30-90 days. You wouldn’t buy or sell stocks or gold today based on pricing from 90 days ago, would you? With housing, this was the best that the industry offered.

  2. Countless variables can impact home values, and they have never been accurately reflected in pricing. For example, while conventional comps factor in basics like square footage, bedrooms and baths, they don’t accurately account for variables like mechanical updates or school quality.
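To make the latency problem concrete, here is a deliberately simplified sketch of how a conventional comp-based estimate works. This is not any vendor’s actual algorithm, and all figures are hypothetical – the point is only that every input is a closed sale that is already 30-90 days stale by the time it’s reported.

```python
# Illustrative sketch of a naive comp-based valuation: average the
# price per square foot of recent closed sales and apply it to the
# subject home. All numbers below are hypothetical.

def comp_estimate(subject_sqft, comps):
    """Estimate value from closed-sale comparables.

    comps: list of (sale_price, sqft, days_since_close) tuples.
    Every comp is weeks or months old by the time it's reported,
    so the estimate lags a fast-moving market.
    """
    price_per_sqft = [price / sqft for price, sqft, _ in comps]
    avg_ppsf = sum(price_per_sqft) / len(price_per_sqft)
    return subject_sqft * avg_ppsf

# Three hypothetical comps that closed 35-88 days ago:
comps = [(450_000, 1_800, 35), (520_000, 2_100, 60), (480_000, 1_950, 88)]
estimate = comp_estimate(2_000, comps)
```

If the market moved 5% in either direction since those sales closed, nothing in this calculation would know it – which is exactly the gap described above.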

A global pandemic and the housing-data meltdown

We’ve examined, ad nauseam, the impacts of the pandemic on market volatility and anomalies, so I won’t go into detail on that point today. What I will say, though, is that pandemic- and tech-fueled warp-speed changes to residential real estate markets across the country meant that the primary problem – latent data – led to vast pricing and valuation uncertainty in every crack and crevice of the industry.

How significant is this issue? Enough so that one of PropTech’s giants, Zillow, has suffered debilitating losses. Here are two key examples:

  1. Zillow “Zestimates”: Zillow led the automated-valuation model (AVM) sector for years, and there’s probably not a homebuyer, seller or peruser who hasn’t looked at Zestimates – estimated valuations based on comps and other basic data (again, using comp data that was at least 30 days old, and more likely 60-90 days or more). When demand for homes vastly outstripped supply during the first years of the pandemic, Zestimates fell woefully behind real-time market behavior, best guesses dominated, and buyers and their agents were disadvantaged.

  2. Zillow “iBuying”: With this venture, launched in 2019, Zillow scooped up homes with the intent to fix and flip them – and for a short time, the bet paid off. But then a confluence of problems, further fueled by the pandemic-driven frenzy, proved fatal to this otherwise straightforward game plan. As The Wall Street Journal reported in November 2021:

“The first quarter delivered home-sale profits that were more than twice as high as anticipated, the company said. Zillow expected to make money primarily from transaction fees and from services such as title insurance—not from making a killing on the flip. The company’s algorithm, which was supposed to predict housing prices, didn’t seem to understand the market. Zillow was also behind on its target for home purchases. By the summer, it had the opposite problem, the company later acknowledged. It was paying too much money for homes, and buying too many of them, just when price increases were starting to slow.”

As noted in that same Wall Street Journal article,

“Technology has in many ways transformed the hidebound real-estate industry. But Zillow ran into some of the limits of technology in a business still informed by emotional attachments, personal tastes and other intangible factors.”


As reported in Yahoo! Finance:

“Zillow was so confident in its pricing algorithm that it said its Zestimates would serve as the initial offer price on eligible homes. That didn’t last. The company announced last year that it was exiting the iBuying business. In a quarterly earnings call, CEO Rich Barton said Zillow was unable to correctly forecast future home prices amid volatility in the pandemic-driven housing frenzy.”

In a nutshell, Zillow couldn’t keep up with the rapid-fire changes in housing valuations in an extremely anomalous and volatile time and so, based on its algorithms, it paid too much for too long. When the data finally came in showing that the market was softening, Zillow had already overpaid, time and again, in overvalued markets (like a ripple in the ocean, even a tiny softening will have a huge impact across thousands of homes).

At its peak less than a year ago, Zillow had a market cap exceeding $48 billion. Today, it’s around $8 billion. Ouch.

PropTech 2.0: The evolution to meet today’s real-time needs

Failures, whether Zillow’s or anyone else’s, are temporary and necessary: They reveal the cracks in our systems and show us ways to serve our clients and our industry better.

The flaws in past AVMs are that they don’t incorporate enough data and they’re reactive, not proactive, in how they generate and use information. For years, as both the owner of a real estate brokerage and a market maker and trader, I could see these fundamental flaws and the need for improvement – to me, data latency was always the industry’s primary challenge, because residential real estate’s success is predicated on accurate housing prices.

Frequent readers of this blog know that I’m part of the leadership team of a new PropTech company called Plunk. Without getting into proprietary details, we’re taking the best of what we’ve learned from outdated AVMs and analytics and beefing up the data feed with information that is more current, more accurate and far more detailed.

We’re looking at this $40-trillion-plus asset class – and how other assets interact with housing prices – collecting all available data and feeding it into AI models to come much closer to a real-time pricing model, as well as a comprehensive database of all factors that could affect the dynamics of residential real estate. Simply put, Plunk’s model behaves much like stock pricing. Although part of the data will still come from closed-sale information (which takes at least 30 days to report), the Plunk dynamic-valuation model (DVM) will also incorporate an enormous amount of real-time information that impacts housing prices, such as commodity trades, stock-market performance, housing condition, housing-image scraping, bonds, location, school districts, interest rates and more. The list is extensive.
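The general idea – a stale comp baseline adjusted by live market signals – can be sketched in a few lines. To be clear, this is a hypothetical illustration of the concept, not Plunk’s proprietary model: the signal names and weights below are invented for the example.

```python
# Hypothetical sketch of a dynamic-valuation idea: start from a
# (latent) comp-based baseline and adjust it with real-time signals.
# Signal names and weights are illustrative only.

REALTIME_WEIGHTS = {
    "rate_change_bps": -0.0004,   # mortgage-rate move, per basis point
    "local_equity_pct": 0.05,     # regional stock-market move, in percent
    "school_rating_delta": 0.01,  # change in school-district rating
}

def dynamic_estimate(comp_baseline, signals):
    """Scale a stale comp baseline by a weighted sum of live signals.

    signals: dict mapping signal name -> current reading.
    """
    adjustment = sum(REALTIME_WEIGHTS[name] * value
                     for name, value in signals.items())
    return comp_baseline * (1 + adjustment)

# Example: rates up 50 bps, local equities up 2%, schools unchanged.
value = dynamic_estimate(500_000, {
    "rate_change_bps": 50,
    "local_equity_pct": 2.0,
    "school_rating_delta": 0.0,
})
```

The point of the sketch is the shape of the computation: the baseline still comes from closed sales, but the output moves the moment any live input moves, rather than waiting 30-90 days for the next batch of comps.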

Think of it this way: Plunk is the first in the industry to apply Wall Street thought processes and pricing models to residential real estate. Here’s why that’s important: As I’ve said in past posts, I believe that the residential real estate market is completely correlated to other asset classes and that valuations move as quickly as stocks. And when more accurate and transparent home-price valuations and analysis become available through Plunk, everyone benefits:

  • Agents can list and show homes with confidence and clarity, relying on up-to-date information, rather than buffering out-of-date data with best guesses about market-specific changes and anomalies, including (as I used to call it) seller assist pricing

  • Sellers can more confidently list their properties, knowing that their homes are priced appropriately and accurately

  • Buyers can purchase more confidently, because they and their agents are using the best-available information to ensure a fair purchase price

  • Lenders, underwriters and insurers will be able to base their business on the true market value of homes, rather than mostly on borrower-based criteria – an inherent benefit to them, as well as to potential buyers entering the market. This is a potentially seismic change in the marketplace, including for institutional investors. Imagine being able to mark to market by the second.

And here’s one of the most significant opportunities: With real-time residential real estate valuations and analytics in hand, investment firms will have far more information available to manage portfolios that include residential real estate assets. This could lead to the creation of new tradable assets – and, in turn, a real-time hedge of the kind available in most other asset classes that trade by the second. So, even if you’re not an institutional investor, you could benefit through your retirement accounts or other investments (and if you are an institutional investor, this could be a golden opportunity). Until Plunk, there’s been no way to do this.

This DVM can also help potentially reduce or eliminate the boom-and-bust cycles of the past and the anomalies and uncertainties that plagued the industry over the last several years. It’s a vast improvement over the existing models for everyone at every touchpoint.

Plunk is working to provide everyone with the best information available so that you can leverage unprecedented opportunities, whether you’re a homebuyer or seller, an agent, in real-estate related businesses, an institutional investor or anyone in between.

To summarize:

Using breakthrough AI capabilities that collect and correlate data from dozens of sources, Plunk’s dynamic-valuation model (DVM) revolutionizes the $40+-trillion U.S. residential real estate market by providing valuations that are more transparent and accurate than anything previously available, leading to high confidence at every industry touchpoint and opening the door to unprecedented investment opportunities:

  1. The key challenge in residential real estate is pricing latency: Current models are built on limited data sources and rely on information that’s 30 to 90 days behind.

  2. The current industry standard – the S&P/CoreLogic/Case-Shiller U.S. National Home Price Index – compiles nearly two dozen indices that are primarily based on this latent data. The problems with this, as with other existing automated-valuation models (AVMs) such as Zillow’s “Zestimates,” are that they rely on latent data, they don’t incorporate enough data and they’re reactive, not proactive, in how they generate and use information.

  3. While conventional housing comps factor in basics like square footage, bedrooms and baths, they don’t accurately account for the countless other variables that impact home values, meaning that this enormous industry’s value has never been accurately reflected.

  4. Plunk’s DVM uses data and information gathered from dozens of sources that impact housing including real-time movement in stocks, bonds and other asset classes; photo-scraping technology to factor in home improvements; major news events including geopolitical and economic shifts; increases and decreases in interest rates; swings in housing-related supply and labor costs; historic averages and anomalies; school-district ratings; geographic market-specific data; closing data; and more, all correlated to provide valuations that are more accurate and transparent than anything that’s ever been available.

  5. Plunk’s DVM brings the residential real estate industry and Wall Street closer together: If you buy a stock, you get up-to-the-moment pricing and analytics – and Plunk’s DVM brings Wall Street’s processes and pricing models and information to the residential real estate industry.

  6. Prior to Plunk’s DVM, investors with significant residential real estate holdings had no tool that enabled them to accurately and transparently value residential real estate assets to leverage opportunities, hedge investments, create derivatives and options and better manage portfolios.

  7. Similar to the way that Bloomberg captures and correlates multiple data sources to provide deeper and more accurate analysis of financial markets, Plunk’s DVM captures the correlations between residential real estate – by far the largest asset class, as well as the asset class that’s potentially the most sensitive to fluctuations in markets, politics and economics – and other asset classes in ways that no existing model has been able to accomplish.

Conclusion: The beauty and value of Plunk’s DVM is this: The more real-time data sources that are captured and correlated to value U.S. residential real estate, the more pricing confidence you can have, whether you’re invested in just one home or millions of them. Ironically, today the existing trading exchanges for residential real estate are the brokerage houses and real estate offices. Shouldn’t they have the same tools as the NYSE and NASDAQ?

The future is very bright and in real time!


