
DataMaster for Appraisers

The Right Data | The Right Time | The Right Decision


Data

The 3 Characteristics of a Reliable Data Source

August 25, 2020 by McKissock


This article was originally published on the McKissock Learning blog.


As a real estate appraiser, you have numerous data types and sources available to solve a specific appraisal problem. In order to reasonably believe that a source is reliable, it is recommended that you assess its dependability based on, at a minimum, the following three characteristics.

1. Accuracy

Is the data source known to be typically correct, factual, consistent, and precise? Is the data required to be entered in a uniform format? For example, a data source such as the MLS may require certain data points, such as site size, to be entered in a consistent, specific format.
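
To make that concrete, here is a minimal sketch in Python of the kind of uniform-format check an MLS might enforce (the field and the format rule are hypothetical, not any actual MLS requirement):

import re

# Hypothetical rule: site size must be entered as acres with two decimals,
# e.g. "0.25". One consistent format keeps records comparable across listings.
SITE_SIZE_PATTERN = re.compile(r"^\d+\.\d{2}$")

def is_uniform_site_size(value: str) -> bool:
    """Return True if the site-size entry matches the required format."""
    return bool(SITE_SIZE_PATTERN.match(value))

print(is_uniform_site_size("0.25"))       # True: conforms to the format
print(is_uniform_site_size("10,890 sf"))  # False: mixed units break consistency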

Illustration

An assessor’s office has recently been found by a governmental oversight committee to have produced property records that are neither accurate nor uniform, in an effort to lessen the tax burden on those well connected to various local community leaders. This finding has been widely reported in the local market’s news sources. In this case, an appraiser would consider the assessor’s property records an unreliable source until the assessor’s office has remedied the discovered concerns.

On the other hand, consider an assessor’s office known to employ a state-of-the-art property data gathering system that is verified on a biannual schedule by an independent source, and that discloses all records, such as sketches, property ratings, and calculations, so the public can replicate the assessor’s work. An appraiser may deem such an office a reasonably reliable source.

2. Trustworthiness

Is the data source considered to be objective, impartial, and credible?

Illustration

Exaggerations pertaining to a specific outcome may compromise or alter the neutrality of the data. For example, property details in a “For Sale by Owner” listing may be unreliable because the author of the data (the homeowner/seller) may exaggerate the property’s positives or under-report its negatives, all in an effort to sell the property quickly and at the highest possible price.

3. Timeliness

Is the data from a time period suitable for the appraisal’s effective date? How often is the data updated? Timeliness matters for dynamic records, which change over time, more than for static records, which do not. For example, if an appraiser confirming the year built of the subject dwelling found that it was built in 1955, she could rely on this data even though its reporting date was 10 years ago, since the year built does not change over time. However, she would not want to rely on MLS data that is 10 years old for determining the property’s current condition, modernization, and quality for an appraisal with a current effective date. The subject home may have been remodeled or renovated over the past 10 years, or it may not have been maintained, either of which could drastically impact the appraisal assignment results.
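
The static-versus-dynamic distinction is easy to express in code. Here is a minimal Python sketch (the field lists and the one-year cutoff are illustrative assumptions, not a rule from any standard):

from datetime import date

# Illustrative split: fields that never change vs. fields that decay over time.
STATIC_FIELDS = {"year_built", "legal_description"}

def is_timely(field: str, reported: date, effective: date,
              max_age_days: int = 365) -> bool:
    """A static field is usable at any age; a dynamic field must be recent
    relative to the appraisal's effective date."""
    if field in STATIC_FIELDS:
        return True
    return (effective - reported).days <= max_age_days

# A 10-year-old year-built record is still reliable; 10-year-old condition
# data is not, for an appraisal with a current effective date.
print(is_timely("year_built", date(2010, 8, 25), date(2020, 8, 25)))  # True
print(is_timely("condition",  date(2010, 8, 25), date(2020, 8, 25)))  # False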

Illustration

The accuracy of some data may decay over time. For example, the subject property’s condition, interior finishes, and modernization are just a few items that may change significantly over time. Thus, as a precautionary measure to limit the appraiser’s liability and to adequately inform the client, the appraiser should disclose the effective date of data sources, especially data points that are impacted by the passage of time.

As an illustration, perhaps an appraiser is relying on an aerial image of the subject property captured three years ago. The appraiser should indicate the effective date of the image in the appraisal report, as the subject property may have been altered in the past three years in a way that negatively or positively impacts the subject’s marketability or value.

Take a deep dive into appraisal data with our FREE Pro-Series webinar on Wednesday, August 26, 2020, “The Data Puzzle: Pulling It All Together.” In this session, you’ll learn about various data sources, how to verify information, and what data should be kept in your workfile. Reserve your spot.

For over 30 years, McKissock Learning has partnered with appraisers from around the country to bring you not only the highest quality courses, but also the practical knowledge and hands-on skills to take your appraisal career to the next level. Learn more at McKissock.com/Appraisal.

Filed Under: Data

Neighborhood Analysis in a Nutshell

June 17, 2020 by Shauna Hatch


Do You Know Why and How?

Neighborhood Analysis is something that every appraiser uses in developing their appraisals. But how many of them are using it correctly? How many of them understand it? As an appraiser, do YOU understand why you should thoroughly grasp how Neighborhood Analysis works? If not, keep reading and we’ll share some insights about this important part of the appraisal process! We draw on a webinar featuring The Appraiser’s Advocate, Tim Andersen, and DataMaster’s Chief Appraiser, Jared Preisler, hosted by DataMaster’s chief marketer, Marty Hamilton.

Why Should You Understand?


It’s common knowledge that all appraisers use tools to complete their appraisals. But do all appraisers understand those tools, why the tools are used in specific circumstances, and how the tools actually work as they provide data to an appraiser?

As Jared explains:

Advisory Opinion 37 deals with computer assisted valuation tools. It can be the sketching tool, the laser measure, the multiple regression analysis tool, etc. To conform with Advisory Opinion 37, appraisers need to understand the tools that they’re using. As an appraiser uses Neighborhood Analysis, they need to understand how it works and what type of data makes it work. Not every tool will work in every market, as there are regional nuances, etc.

As a successful appraiser, it’s imperative that you understand the ins and outs of the tools that you use! This need is illustrated with an example from Tim.

Tim shared a story about some appraisers who bought some regression analysis software, started using it without understanding it, used the software inappropriately, and got called before the state board. The appraisers were chastised by the board and instructed that it was their responsibility to understand the tool better (and any other appraisal tools that they used!). They needed to do their part and learn how the tool worked before using it to come to conclusions in an appraisal that they signed their names to.

Tim gives appraisers three tips when considering a tool to use in the appraisal creation process:

  1. Know what the tool is and what it does.
  2. Know how to use it.
  3. Understand if the tool really helps you answer the question.

Do you doubt that you thoroughly understand the tools that you use? We’re here to help, and today we’re discussing Neighborhood Analysis with the intention of giving you the understanding of how the tool works in DataMaster and how it will help you create a defensible Neighborhood Analysis!

What Neighborhood Analysis Does

Some of the Neighborhood Analysis graphs

Anytime DataMaster’s new Neighborhood Analysis tool is discussed in our office, Jared’s excitement is clearly evident! As our Chief Appraiser, Jared is heavily involved with developing our appraisal tools and is extremely excited about this newly released tool!

Neighborhood Analysis by DataMaster has been meticulously developed and tested by Jared along with our development teams and other appraisers who work in our office, including our president and CEO, Rick Lifferth. Trust us, this tool is exactly what any successful, driven appraiser is looking for because it was developed by active, wildly successful appraisers! (For more insight into DataMaster, hop over to “Let the Appraiser Decide” and learn more about what DataMaster does for appraisers!)

Why is this tool so exciting to Jared and the other appraisers in our office? Because the tool keeps the appraiser in control of the data, saves them even more time on each appraisal, and packs impressive capabilities that benefit you, the appraiser. Let’s delve into those capabilities! We’re sure you’re going to be impressed!

The Details

Summary of market conditions data in Neighborhood Analysis

Neighborhood Analysis by DataMaster works as the appraiser feeds their data into DataMaster. The appraiser defines the market to the tool. DataMaster then segments and compares the two contrasting datasets, whether they are from a tight competing market or a broader neighborhood, or two-stories versus ramblers, etc. The tool allows the appraiser the flexibility of mimicking whatever the market is.

One example of how this would be used: suppose a builder starts a new project using the exact same models the builder built in a different area. The older area is selling right now, so an appraiser might have to compare the two competing markets to each other to come up with a location difference. Jared exclaims:

This tool would be dynamic, it would be fantastic for me to use it here! I could bring the two markets up, compare them, and show what the location difference is!

Next, DataMaster presents the comparison between the two markets. The comparison will show one or two years of data month to month or up to six years quarterly.

The appraiser then compares the information and, with DataMaster’s help, analyzes the data. One quick note: the appraiser can import all of the data that they want to import; there’s no limit!
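
To give a feel for the segmentation idea (a minimal sketch only, not DataMaster’s actual implementation; the data and function name are hypothetical), here is how grouping sales into monthly or quarterly buckets and tracking the median price per bucket might look in Python:

from collections import defaultdict
from datetime import date
from statistics import median

def median_price_by_period(sales, monthly=True):
    """Group (sale_date, price) pairs into monthly or quarterly buckets
    and return the median price per bucket, oldest first."""
    buckets = defaultdict(list)
    for sale_date, price in sales:
        if monthly:
            key = (sale_date.year, sale_date.month)
        else:
            key = (sale_date.year, (sale_date.month - 1) // 3 + 1)  # quarter 1-4
        buckets[key].append(price)
    return {key: median(prices) for key, prices in sorted(buckets.items())}

# Two hypothetical competing markets, e.g. the new subdivision vs. the older one.
new_area = [(date(2020, 1, 15), 410_000), (date(2020, 2, 3), 415_000)]
old_area = [(date(2020, 1, 20), 385_000), (date(2020, 2, 11), 392_000)]
print(median_price_by_period(new_area))
print(median_price_by_period(old_area))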

Neighborhood Analysis Search Criteria Summary

Neighborhood Analysis by DataMaster was built to GSE standards, yet another benefit of this extremely helpful tool. Once the appropriate data is chosen by the appraiser, the tool fills out the required forms and provides meaningful graphs (28 to choose from!) and statistics. The appraiser then customizes the reports, saving time and creating a solid, defensible file which is automatically saved for future reference.

Do you need a little more convincing? Jared Preisler praises Neighborhood Analysis by DataMaster and explains why it’s so important!

This tool will help the appraiser make better trending decisions. DataMaster gives the appraiser all of the information that they need, presented in a way that’s better for the appraiser to digest than is presented elsewhere. Also, when the appraiser hits “Send to Report,” all of the information in the Neighborhood Analysis gets saved in the appraiser’s archive! Many appraisers don’t save their Neighborhood Analysis, a USPAP violation, so having the analysis automatically saved is a huge plus for appraisers!

And that’s how Neighborhood Analysis by DataMaster works! We’re so excited for this new tool, and we know that you’ll love it! It’s amazing!

A Little Something About Neighborhood Analysis


With the heavy-duty lifting done, we’d like to give you a bonus tidbit, an insight from Tim on how to ensure that your Neighborhood Analysis is thorough.

Taking the time to investigate environmental forces can pay off in the creation of an appraisal report that misses nothing. You need to think of every little thing that might have an impact on a property, including ghosts!

Years ago, Tim was asked by a friend to help him appraise a commercial property in Key West, Florida. The property was a gorgeous B&B on nearly an acre of land. Myriad plants and trees dampened most of the noise around them, other than the occasional airplane passing overhead. The main building was built post-Civil War out of high-quality oak that had superbly withstood the test of time. The venerable home showed off beautiful, high ceilings and quality workmanship. More buildings had been added to the property over the last century to create more traditional hotel-style rooms.

Not only was the building unique and magnificent, the property surrounding the B&B had been declared a botanical sanctuary by the University of Florida.

There was much to see and discuss as the B&B manager proudly showed Tim and his friend the lush grounds.

After touring and conversing a while, Tim off-handedly asked if the property was haunted. Surprised that Tim would ask, the manager said that the original builder of the home actually still lived there and haunted her bedroom! The ghost reportedly folded clothes in her old bedroom while the guests were out! This strange environmental factor was actually a plus for many guests, bringing customers to the B&B with the hopes of a glimpse of the specter!

Tim’s point in relating the story was that as an appraiser, you should ask questions and dig into the nuances of the local area. You must know the history, which businesses are coming in and bringing jobs or going out and taking them, whether the comps that were previously used are still relevant, whether the population is aging or on the young side, and whether there are oddities, such as ghosts, that might draw in visitors (or chase them away).

Wrapping It Up

Tim’s ghost story is a good example of the depths that a good Neighborhood Analysis must go to. Just remember, DataMaster is here to help! We’re always working on improving the work process for appraisers. Neighborhood Analysis is something we’re so excited about (if you can’t tell). Along with CompTracker, we feel like we’ve outdone ourselves this year in offering appraisers tools that will propel them forward by enabling them to complete their appraisals faster, more thoroughly, more accurately, and more defensibly! Remember, we develop our tools with explicit input from active, successful appraisers who are involved with the creation, development, and fine-tuning of the finished product! These tools are developed for you, the appraiser!

We appreciate Tim Andersen for taking the time to join Jared Preisler and host Marty Hamilton for our Neighborhood Analysis webinar, which this blog post is based on. If you’d like to see their discussion in more detail, go to the webinar link here: https://event.webinarjam.com/channel/Neighborhood.

To get Tim’s ebook go to: TheAppraisersAdvocate.com/product or email him at: Tim@theappraisersadvocate.com.

Thanks for reading, and Happy Appraising!

Filed Under: Appraisal Business, appraiser information, Data, DataMaster Software, Resources for Real Estate Appraisers Tagged With: How to do a Neighborhood Analysis, Neighborhood Analysis, Neighborhood Analysis for Appraisers, What is a Neighborhood Analysis

The Big Black Box: The Difference Between Big Data and Big Analytics

June 12, 2018 by Marty Hamilton

He stood before the three men and one woman with a bead of sweat creeping down his left temple.  It was not that their presence was particularly ominous in appearance. After all, they were seated on metal folding chairs behind a plastic banquet table.  No, his trepidation was more about what they represented and even more so, about what their decision could do to him and his career.

“Mr. Smith [names have been changed],” she began, “your cost approach does not seem to make sense. You claim Marshall and Swift as your source of the data, but what you have in this report does not seem to have any workfile support. How do you explain that?”

Mr. Smith knew that the next thing out of his mouth could dramatically affect his future as a real estate appraiser, so he paused and chose his words well.  “Well ma’am. I don’t quite know what to say. I plugged the numbers into the computer program correctly. What you see there is what it spit back out. I just assumed it was right.”

Technology can be a powerful tool. It can also be a deadly poison. The same can be said of the ominous “big data” that we often hear about. The difference? It comes down to how the tool is used, how it is understood, and how the appraiser might be involved (or not be involved, as the case may be).

In my years of mentoring other appraisers to be more successful business owners, there is a common theme I have observed: most of us do things the way our mentor taught us to do them, and the same way we have always done them. For those who were taught correct principles and techniques, this is not a bad thing. For others, who may have been ‘taught’ by the seat of someone else’s pants, there may be some improvement to be had.

Though we have all been taught and have practiced the fundamentals of appraising, we sometimes rely too much on the all-mighty-computer-thingy and too little on that soft, squishy, grey thing between our ears. Data, numbers, and statistics can slip into that same realm. We plug the numbers into a program and an answer is spit back out. That is the very definition of a “black box”: you have no idea what happened between data input and result output.

Relying on an ominous black box or sets of data we do not understand has never worked and will become less and less defensible into the future.  The world is changing and the continent of appraising is especially changing.

Many appraisers are fearful of their future.  Fear is a great motivator but a poor problem solver.  It is good to recognize that the profession of valuation is and will be different into the future.  “Three comps and you are done,” is no longer acceptable. The way we approach valuation will continue to evolve (some would say devolve) in the next several years.  What our clients need or want is not the same as it has always been. I predict the 1004 will either disappear or be completely remodeled in the near future. Furthermore, like it or not, big data is becoming more and more important to our appraisal processes.  

The amount of data that is gathered and distributed on a daily basis is mind-blowing.  Numbers tell a story, but they often do not tell the entire story. This is where the valuation expert (read appraiser) comes in.  While computers are getting better and better at interpreting data, they should never replace good old fashioned common sense and an expert who lives, works, and understands the local market.  

It is time to reinvent yourselves, appraisers! It is time to show our clients that the black box approach is not enough. They need us. We need to embrace big data, but do so with a cherry on top. We are the cherry. We are the difference between Mr. Smith above and a professional valuation expert.


Dustin Harris is a successful, self-employed, residential real estate appraiser. He has been appraising for over two decades. He is the owner and President of Appraisal Precision and Consulting Group, Inc., and is a popular author, speaker, and consultant. He also owns and operates The Appraiser Coach, where he personally advises and mentors other appraisers, helping them to run successful appraisal companies and increase their net worth. His blog is read by over 20,000 appraisers nationwide, and he facilitates several appraiser membership groups both online and in person. His free podcast is listened to by thousands of appraisers each week and can be downloaded on iTunes and Stitcher Radio. He and his wife reside in Idaho with their four children. He loves playing in the outdoors and watching movies indoors.

Filed Under: Data

Accuracy of Property Data – Important as Ever?

November 13, 2017 by admin

This is a guest post authored by Robert T. Murphy

I have had some discussion lately with valuation industry participants regarding the importance of the appraiser obtaining accurate property data as well as understanding its source for comparable properties considered in an appraisal. These discussions are the result, in part, of the addition of Advisory Opinion #37 (AO-37) to the Uniform Standards of Professional Appraisal Practice (USPAP) coming in January 2018.

A lot has been written about AO-37, and I have no intent to rehash the Opinion other than to say that it certainly addresses the issue that the credibility of any analysis incorporated into a specific tool’s output depends, in part, on the quality of its data. It is important to note that AO-37 isn’t concerned with just regression-type tools but also, as addressed in an illustration, those tools that automatically input information from an MLS. It is also clear that the appraiser must be the one controlling the input.

Government-Sponsored Enterprises (GSEs) Fannie Mae and Freddie Mac are also concerned about the quality of property data. Whether it’s referred to as “data integrity” or “data accuracy,” the end meaning is the same. Both GSEs currently have analytical tools to assist their customers in analyzing appraisal reports. Fannie Mae has Collateral Underwriter (CU), while Freddie Mac’s offering is Loan Collateral Advisor. While these are two independent, unrelated systems, both attempt to identify underlying collateral risk by analyzing data integrity/accuracy issues.

In Fannie Mae’s Collateral Underwriter training, “Understanding CU Risk Scores, Flags, and Messages” (https://www.fanniemae.com/content/recorded_tutorial/collateral-underwriter-risk-score-flags-messages), they specifically address the underlying factors considered in the development of the CU Risk Score. One of those factors is Appraisal Quality, which in part includes Data Integrity. The following excerpt is directly from their training material, available on their website.

Appraisal Quality Components (Fannie Mae)

They also go on to state:

“First, when we look at Data Integrity, we’re looking to determine if the physical attributes and transaction terms of the subject property and comparables are accurately reported. This is the bottom of the pyramid here because it is the foundation for any appraisal report. We all know the old saying “garbage in, garbage out”. If the subject or comps are not accurately represented, it can influence our judgment of the entire appraisal report. It affects whether or not comps look similar to the subject, the direction and magnitude of the adjustments, and which comps should receive most weight in reconciliation.”

With respect to Freddie Mac’s Loan Collateral Advisor, they state in their publication Understanding UCDP Proprietary Risk Score Messages, available on their website at http://www.freddiemac.com/learn/pdfs/uw/ucdp_riskscores.pdf, that there are two risk scores: a Valuation Risk Score and an Appraisal Quality Risk Score. The publication further provides the following:

“Appraisal Quality Risk Score:

  • Is a risk measure pertaining to the appraisal quality, data accuracy, and completeness of the appraisal report. The lower the risk score, the lower the risk of a significant appraisal quality defect or deficiency.
  • Assesses multiple components of the appraisal to help you determine whether appraisals adhere to our standards and guidelines.
  • Is influenced by many factors, including the integrity of the appraisal data, the relevance of the comparable sales selected relative to the pool of available sales, and the reasonableness and supportability of the adjustments.”
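
Both scores rest on the same foundation: the attributes reported for the subject and comparables should agree with independent source records. As a minimal sketch of that idea in Python (the field names and tolerance are hypothetical assumptions, and the GSE tools are far more sophisticated), a basic cross-source consistency check might look like this:

def integrity_flags(reported: dict, source: dict, tolerance: float = 0.05) -> list:
    """Flag fields whose reported value differs from the source record by more
    than the relative tolerance (numeric fields) or at all (everything else)."""
    flags = []
    for field, reported_value in reported.items():
        source_value = source.get(field)
        if source_value is None:
            continue  # no independent record to check against
        if isinstance(reported_value, (int, float)):
            if abs(reported_value - source_value) > tolerance * abs(source_value):
                flags.append(field)
        elif reported_value != source_value:
            flags.append(field)
    return flags

# A comp as reported in the appraisal vs. the public record for the same parcel.
comp_as_reported = {"gla_sqft": 2400, "year_built": 1987, "bedrooms": 4}
public_record = {"gla_sqft": 2150, "year_built": 1987, "bedrooms": 4}
print(integrity_flags(comp_as_reported, public_record))  # ['gla_sqft']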

What is an appraiser to do?

As we all know, the availability and quality of property data varies greatly throughout the country. Taking into consideration the preceding discussion on USPAP, Fannie Mae, and Freddie Mac, I believe it is incumbent upon the appraiser to ensure they have access to the most complete and accurate sources of data available for the geographical areas in which they complete appraisals. AO-37 makes it clear that it is important not only to control the input data but also to understand the source data.

Use of Multiple Listing Services (MLS)

With the growth, consolidation, and mergers of MLSs, along with technological advances over the years, they are without a doubt one of the main sources of property data and information for appraisers. Aside from the usual input errors which can occur, the accuracy of some MLS data may differ depending on the format in which it is accessed. Let’s take a brief look at some of these formats: File Transfer Protocol (FTP), Internet Data Exchange (IDX), and the Real Estate Transaction Standard (RETS).

One of the earliest methods of property-related data delivery was FTP, which is used to transfer files from one host to another. For years it was the standard for transferring property data files. One issue with FTP is that there are no set standards between MLSs, which means it can become costly. The data transferred is also fairly basic, meaning that many fields are not included. Additionally, in order to update an individual record, the entire file is moved all at once. This means that the entire database is moved in bulk each time the records need to be updated: certainly not efficient, and it can be very time-consuming. As a result, updates tend to be less frequent.
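
To picture that inefficiency, here is a minimal Python sketch of the pattern (host, credentials, and file names are placeholders, not a real MLS endpoint): with FTP, the client pulls the entire listings file even when only a handful of records have changed.

from ftplib import FTP

def download_full_listing_file(host, user, password, remote_name, local_name):
    """Pull the whole listings file: FTP's RETR has no 'only what changed'
    option, so every refresh moves the full database in bulk."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_name, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_name}", fh.write)

# Placeholder endpoint; a real MLS would supply its own host and file name.
# download_full_listing_file("ftp.example-mls.test", "appraiser", "secret",
#                            "listings_full.csv", "listings_full.csv")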

Many of us remember the MLS books; I have many fond memories of perusing the pages of those books for comparables and leads. Fast forward: that same information, and then some, is now available on multiple websites. That was made possible by IDX, which is the data exchange between an MLS board’s database and a broker’s, realtor’s, or other entity’s website. For the first time, it was easy for the public to access large numbers of homes for sale. As home buyers became more Internet-savvy, IDX evolved to provide more options. Agents and brokers wanted more than the basic search products their MLS offered, so a new raw IDX data feed option was created. Raw data feeds allow agents and brokers to download all available IDX listings in a simple format (like an Excel spreadsheet) that can be used with custom home search applications. For all of its advantages, one of the major issues with IDX data is that it tends to be incomplete regarding certain fields, such as sales history and statuses.

RETS is another method, developed specifically for the real estate industry, to request and receive data. RETS is used to give brokers, agents, and third parties access to listing and transaction data. MLSs nationwide have moved to adopt RETS as the industry standard because it drastically simplifies the process of getting data from an MLS to an agent’s or broker’s site or a third-party vendor. RETS allows the user to customize how the data is to be displayed. It also provides the most current data available, in that it is constantly updated.

RETS provides the user with data that is both more accurate and easier to work with. RETS data is designed to match each MLS’s unique practices, so its field names and option values typically resemble what is seen when logged into the MLS itself. RETS is also easier for third-party vendors to troubleshoot than an FTP feed, so any issues can usually be diagnosed and corrected quickly.

Overall, RETS data tends to be extremely robust, typically contains most fields (including the property sales history and statuses), and is easy to access and support. It also makes it possible to get specific pieces of data (rather than an entire, large file), and data can be accessed on demand at any time of the day. Only the data that have been recently added or changed need to be updated. This means that the data can be refreshed more regularly, thereby providing more current data.
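
For a sense of how that incremental refresh works, here is a simplified Python sketch of a RETS-style Search request (the URL, resource, and class names are placeholders, and a real server would also require a Login transaction and authentication first). The DMQL query asks only for records modified since the last sync:

import requests

SEARCH_URL = "https://rets.example-mls.test/rets/search"  # placeholder server
LAST_SYNC = "2020-06-01T00:00:00"

params = {
    "SearchType": "Property",        # RETS resource
    "Class": "Residential",          # class within the resource
    "QueryType": "DMQL2",
    "Query": f"(ModificationTimestamp={LAST_SYNC}+)",  # changed on/after last sync
    "Format": "COMPACT-DECODED",
    "Limit": "500",
}

response = requests.get(SEARCH_URL, params=params)
print(response.status_code)  # the body would hold only the changed records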

If MLS data is being used for anything other than providing listings for public website display, RETS is the best method for MLS data access and retrieval. Additionally, since access to the MLS’s RETS server requires an agreement outlining the permitted use, fees (if any) and redistribution provisions of the data, it can be, and is, used for various purposes.

However, one of the main stumbling blocks with RETS is that the data feed itself is impossible to use without other software to understand it.

This is where a company such as DataMaster comes into play. For those who have never heard of DataMaster, their system converts the RETS data into an understandable format. This in turn allows appraisers to complete appraisal form reports more quickly and easily while using the most accurate data available. With the appraiser in control of comparable selections, their patented process provides the ability to simultaneously download MLS data as well as public record information into form reports while remaining UAD compliant. This allows appraisers to save time and money when it comes to completing their reports.

All of this is possible because DataMaster actually goes to local MLSs through the “front door” and signs contracts allowing it access to and use of data directly from the RETS server for each particular MLS. What separates DataMaster from other vendors in this space is that, as a result of these relationships with local MLSs, there is no “gray data” or questionable permitted use of data involved; you can trust the information is the most accurate, timely, and properly formatted information available from the MLS and public records.

Conclusion

Having spent approximately 40 years in the valuation industry, I like to take a look back from time to time to see what was being taught then, which honestly set a foundation for me. While contemplating this post, I took a look at The Appraisal of Real Estate, Seventh Edition (1978) by the American Institute of Real Estate Appraisers and came across the following sentence in Chapter 15, “The Market Data Approach: Principles”:

“Since no conclusion is better than the quality of data on which it is predicated, the appraiser screens and analyzes all data to establish its reliability and applicability before using it as the basis for a value indication in the market data approach.”

So, the answer to the question of the importance of accurate data is an unequivocal “yes”: it is indeed as important as ever!

About the Author:

Robert T. Murphy is a senior valuation executive with 40 years’ experience in the valuation industry. Currently, Mr. Murphy is President of Collateral Advisors LLC, an independent consulting firm focusing on a wide range of valuation-related issues, including but not limited to valuation techniques, policy, and compliance, as well as data integrity and technology-related issues. Mr. Murphy was most recently Director of Property Valuation and Eligibility at Fannie Mae.

(DataMaster USA is a client of Collateral Advisors LLC)

Filed Under: Data
