
Market Update

A market update for Las Sendas, North East Mesa, and East Valley residents who want to keep an eye on the market!

Zillow, Redfin, Etc.: Are They Accurate? December 30, 2015

Connor Bearse

December 30 - Automated Valuation Models (AVMs) for homes appear to be multiplying quickly these days. I thought that it was about time we applied some Cromford® style reality testing to these, because most members of the public seem to assume these computer-generated numbers have some resemblance to reality. Just as a stopped clock is completely accurate twice a day, every AVM is going to be right sometimes. However, each one often gives estimates that are wildly different from all the others. Some are starting to make very bold claims of accuracy, but what do these claims really mean? If a home has not sold in the past few days, there is no "right answer" to compare an estimate to. Even a full appraisal is only one person's opinion of value, and another appraiser will almost certainly disagree with the first one. Some of these estimates change dramatically from one week to the next. The real world does not behave like that.

I feel the public is being given a lot of data that in many cases will just add to the confusion and give a false sense that a correct answer actually exists.

To apply some mathematics to the problem I looked at 100 homes and examined their estimated values from 6 sources:

  • Zillow Zestimate
  • RPR RVM Value*
  • Monsoon Comp Based Valuation*
  • Monsoon FCV Ratio Valuation*
  • Redfin Estimate
  • Homesnap Estimated Value

(those marked * are intended for use by real estate professionals only)

Only Monsoon was successful in finding all 100 of the homes in the sample, while Zillow and Homesnap each failed to find 1 (not the same one). Redfin and RPR failed to find 4 homes due to incorrect address data in their master files, usually inherited from the incorrect address data at the Maricopa Assessor. In addition, I found that Redfin and Homesnap both refused to give an estimate for any of the homes that were currently active listings. That knocked another 6 homes out in each case. Homesnap also refused to guess on another 3 homes, meaning that (along with Redfin) it had the highest failure rate at 10%. To be fair, though, Homesnap had the fastest response time, which endeared it to me while conducting the survey. Monsoon was the slowest to respond, but was by far the most flexible in allowing user adjustments to the properties used for comparison. I made no adjustments except in 1 case where there were no comparative sales within 4 miles, so I increased the range to 5 miles.

I took the average value from the 6 sources and compared each source against it to see if any appeared to be biased in a particular direction. One source stood out as an outlier in this test: Zillow's Zestimates were on average far higher than all other sources, and not by a small amount. The average Zillow Zestimate was 22% above the average of the other 5 sources. Of course it is statistically possible that Zillow is right, but it is also extremely unlikely. It is more likely that Zillow has decided that it will get more traffic if it applies significant bias to the upside. Home sellers love to believe their home is worth more than it really is. Zillow is (far more often than not) telling them what they want to hear. People love to be told what they want to hear, not what they need to hear (something cable news channels discovered long ago). Still, I could be wrong about this, so I will continue to test more samples.

The other 5 sources were much closer together. Homesnap tended to be 5% above average and RPR 1% above average; Redfin came in at the average, while the Monsoon Comp Based Valuation was 1% below average and the Monsoon FCV Ratio Valuation was 5% below average.
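The leave-one-out comparison described above can be sketched in a few lines of Python. This is an illustrative sketch only, using the six estimates for the single example home cited later in this article rather than the full 100-home sample, so the percentages it prints will be much more extreme than the sample-wide averages quoted above:

```python
# Leave-one-out bias check: compare each source's estimate to the
# average of the other 5 sources.
# Values are from the article's example home (6115 N 38TH PL, 85253).
estimates = {
    "Zillow":         1_514_785,
    "Redfin":           968_388,
    "RPR":              739_830,
    "Homesnap":         725_000,
    "Monsoon (comp)": 1_159_206,
    "Monsoon (fcvr)": 1_030_858,
}

total = sum(estimates.values())
biases = {}
for source, value in estimates.items():
    # Average of the OTHER five sources, excluding this one.
    others_avg = (total - value) / (len(estimates) - 1)
    biases[source] = (value - others_avg) / others_avg * 100

for source, bias in biases.items():
    print(f"{source:15s} {bias:+6.1f}% vs. average of the other 5")
```

Even on this one home, Zillow's upward skew is obvious; across the whole sample the same calculation produced the 22% figure quoted above.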

This is not to suggest any of them were consistently "accurate," if that means anything. For all 6 sources, the gap between a single estimate and the average estimate could be wild. Here are the maximum and minimum variations within this sample of 100:


This means there is a reasonable chance that the specific estimate you are looking at is more than 20% above or below the average value of the other 5 models. It is especially likely if your specific estimate comes from Zillow.

As a statistician, I am tempted to talk about standard deviation, but I suspect the audience for this topic among real estate professionals is rather small. So I will resist the temptation. Deviation sounds like a bad word, and you want as little of it in your samples as possible. Rest assured that the standard deviation in this set of values is much higher than you would like. The conclusion is that ALL automated valuation tools are very dangerous when given too much credibility. The human touch is important in estimating the value of a home, especially if that human has taken a good look inside and around the house before deciding what it might be worth in today's market.

Let me just cite one example among my 100 test cases: 6115 N 38TH PL in 85253

  • Zillow = $1,514,785
  • Redfin = $968,388
  • RPR = $739,830
  • Homesnap = $725,000
  • Monsoon = $1,159,206 (comp)
  • Monsoon = $1,030,858 (fcvr)
  • Average = $1,023,011

With a standard deviation of 29% ($294,088) this is a highly inconsistent set of values.
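The figures above are easy to check. A minimal sketch using Python's standard library (this assumes, as the 29% figure implies, that the article is quoting the sample standard deviation, i.e. dividing by n-1):

```python
import statistics

# The six estimates for 6115 N 38TH PL (85253) from the article.
values = [1_514_785, 968_388, 739_830, 725_000, 1_159_206, 1_030_858]

mean = statistics.mean(values)     # the article's average, ~$1,023,011
stdev = statistics.stdev(values)   # sample standard deviation, ~$294,000
pct = stdev / mean * 100           # standard deviation as a % of the mean

print(f"average = ${mean:,.0f}, std dev = ${stdev:,.0f} ({pct:.0f}%)")
```

Running this reproduces the article's average of $1,023,011 and a standard deviation of roughly 29% of the mean.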

Another problem I found is that many of the systems use incorrect addresses. For example, Zillow has one entry for 3802 E BETHANY HOME RD and another for 6002 N 38TH PL. These are actually two addresses for the same home, which sits on the corner between Bethany Home Road and 38th Place. Only one is acceptable to the US Postal Service, but many real estate databases contain the other address. Funnily enough, Zillow gives 2 different estimates for the one house, though the difference is only 2%. RPR has a habit of including homes that never existed. If there was once an MLS listing for a future home to be built, RPR acts as if that home is real. This is a bad assumption if that listing was from 2007 through 2009.

If I had to nominate the least biased of the 6 models I tested, then the newly announced Redfin Estimate would be my current pick, though the fact that it is missing a value for 10% of my sample was a big disadvantage.

Expect more on this subject in future observations. If you would like me to include other valuation tools in future tests, please feel free to email me with your suggestions.