In Part I of this series we looked at the problems and challenges facing the MLS community of today.
In Part II we defined the problems in greater detail and examined the systemic and institutional monopolies that perpetuate those problems.
In this installment we will look at a possible solution, one based in technology, not in governance or management/ownership structure, one that presents benefits for all parties in the MLS ecosystem.
The climate is ripe for innovation in the MLS technology industry. Advances in data storage capacity, exponential increases in interconnection speed, the advent of the “app store” approach for mobile devices, and the growth in services that compete with the MLS for the agent’s attention and tech-spend all point to an urgent need to redefine the traditional vendor-MLS relationship.
The industry has long dreamed of a method whereby the MLS, and subsequently its subscribers, would not be hamstrung by the need to select a single vendor, one that controls not only the database but also all of the applications that use the database. Under the moniker of “front end of choice,” the industry has pursued standardized data definitions, query/response methods and transport protocols.
The concept of an open MLS that would allow any number of user interfaces (front ends) is not new. It was discussed as far back as 15 years ago, when I was still a novice product manager at Interealty Corp. (later to become part of CoreLogic after multiple ownership changes). The concept was revisited in 2008 when Saul Klein (then of Internet Crusade, later Point 2 and points beyond 2) published his MLS 5.0 manifesto. In his paper, Saul said the new MLS must be “open, collaborative, self-organizing and self-policed.” He added, “MLS needs to redefine itself from a purely business-to-business network tool to a marketing facilitator for its participants and subscribers. It needs to take advantage of its assets and shift its paradigm from (just) information about what is for sale to information on all property whether for sale or not.”
Seven years later, the concept remains unfulfilled. In the absence of advances in MLS technology, others have stepped in to fill the vacuum. National websites that combine listing data with property data, demographics, lifestyles, and user generated content and ratings have proliferated. But the MLS world now has an opportunity to catch up and take the lead in a big way. The stars are aligning and many pieces of the puzzle are starting to fall into place.
Data Standards – Finally
Progress has been slow, but lately great strides have been made by the Real Estate Standards Organization (RESO) in codifying data norms through the Real Estate Transaction Standard (RETS). RETS has been used for years to standardize the way data is distributed from the MLS to licensed recipients. Now, through adoption of the Data Dictionary and the pending release of the RETS-API, RESO is pushing the industry toward more internal standards and therefore more interoperability.
A few MLSs have gone so far as to embrace the RETS Update Transaction that will allow brokers to upload (and maintain) listing records into the MLS from the Broker’s intranet, rather than the MLS’s front-end interface. This transaction standard has made possible the much-anticipated Upstream project, which will (as best we understand it at the time of this publication) aggregate new listings from participating brokers and feed them TO the MLS (as well as to other syndication destinations) from the broker’s back office, rather than the reverse. Upstream has the potential to be the first, and most widely used, front end of choice for a new generation of MLSs willing to embrace it.
The nation’s largest MLS, California Regional (CRMLS), has already announced it is ready to receive the listing input feed from Upstream, using technology supplied by Atlanta-based Bridge Interactive Group. The time is right to explore how much farther we can push the standards and how much closer we can get to achieving not just front end of choice but ALL tools of choice. A change in back-end database architecture opens a whole world of possibilities.
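The RETS-API mentioned above is OData-based, so a client expresses field selection and filtering directly in the query string using Data Dictionary names. Here is a minimal sketch of building such a request in Python; the endpoint URL is a placeholder, and the exact parameters should be checked against the published specification:

```python
from urllib.parse import urlencode

def build_reso_query(base_url, select_fields, filter_expr, top=10):
    """Build an OData-style query URL of the kind the RETS-API uses.

    Field names follow the RESO Data Dictionary (e.g. ListPrice,
    BedroomsTotal); the base URL here is hypothetical.
    """
    params = {
        "$select": ",".join(select_fields),
        "$filter": filter_expr,
        "$top": str(top),
    }
    return f"{base_url}/Property?{urlencode(params)}"

url = build_reso_query(
    "https://api.example-mls.com/odata",           # hypothetical endpoint
    ["ListingKey", "ListPrice", "BedroomsTotal"],  # Data Dictionary names
    "StandardStatus eq 'Active'",
)
print(url)
```

Because every vendor maps to the same field names, the identical query should work unchanged against any compliant MLS back end.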
The Open Database concept
To facilitate greater technological innovation, the industry needs to overhaul the current system, starting with the database, and shed the legacy of closed, proprietary repositories. Technology has advanced to the point where keeping the database proprietary and inaccessible is no longer necessary for a well-functioning integrated system.
By separating the database (the back end) from the other elements of the system (the front ends), opening access to the database through a collection of application programming interfaces (APIs) and software development kits (SDKs), contributing the APIs to a common-good licensing system, and releasing the software code that drives them to an open-source repository, the MLS could make access to and open use of the database available to any technology partner with whom it enters into a license agreement.
In such an environment, the database remains the core of the system. But it is not inextricably linked to a fixed set of front-end applications, those tools that an agent uses to do business. Any tool, written by any developer, that follows the published guidelines for access and connectivity (the open APIs) could query for data and receive back, in real time, from a live database, exactly the data that it asked for and nothing more. Just think of the possibilities:
- Agents could choose from many different versions of core MLS functionality by picking from a catalog of Hotsheet apps, CMA apps, Buyers’ Tour apps, and on and on. Agents pay only for what they use, and in many cases (certainly not all) lower their monthly fees.
- Most of the broker complaints listed by The Realty Alliance would/could be addressed.
- Products and services would be unbundled and open to free selection.
- Products that compete with a broker’s service could be masked from agents in that firm.
- Data feeds would be standardized among all systems on the database.
- Broker back-office integration would be greatly simplified.
- Minimal downloading would mitigate most data piracy.
- MLS conversions would never again be needed. Instead of changing the whole system to get new functionality for its subscribers, the MLS would simply license new application providers. The marketplace would sort out the good from the bad.
- Data downloading and synchronization would be a thing of the past.
- Data distribution management would be all but eliminated because, except in a few large-scale cases, the applications would not download all the data from the MLS.
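The "exactly the data it asked for and nothing more" idea above can be sketched as a field-scoped query, where the open API returns only the intersection of what an application requests and what its license permits. The in-memory store and field names below are purely illustrative:

```python
# In-memory stand-in for the open MLS database; field names are
# illustrative, not a published schema.
LISTINGS = {
    "L100": {"ListPrice": 425000, "BedroomsTotal": 3, "City": "Mesa",
             "PrivateRemarks": "Lockbox on side gate"},
}

def query(listing_key, requested_fields, licensed_fields):
    """Return only fields that were both requested by the app and
    permitted by its license -- nothing more ever leaves the database."""
    record = LISTINGS[listing_key]
    allowed = set(requested_fields) & set(licensed_fields)
    return {f: record[f] for f in allowed if f in record}

# A consumer-facing app licensed only for public fields never sees
# PrivateRemarks, even if it asks for it.
print(query("L100", ["ListPrice", "PrivateRemarks"], ["ListPrice", "City"]))
# -> {'ListPrice': 425000}
```

Because filtering happens at the source, there is no bulk feed sitting on a vendor's server waiting to be misused, which is what makes the data-piracy and distribution-management points above plausible.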
The ramifications and possibilities that follow this initial step are myriad, complex and full of potential. We will discuss some of those later in Part IV of this series, but first we need to get past the initial obstacles to this concept.
Previous roadblocks to progress
Why has no one tried this before? There are many reasons, some large, some small, but all difficult to overcome.
No MLS vendor or MLS operator has the financial wherewithal (either the resources or the incentive) to create such a new ecosystem from scratch. Many are hard-pressed to keep pace with the massive financial investments being made by the online portals they see as a major competitive threat, let alone divert more funds to a speculative and potentially disruptive venture. The costs of such a start-up, with no guarantee of a return on investment, are simply too large and too risky for any single entity to take on.
As with nearly everything else in the MLS business, politics play a major role. Vendors do not want to alienate their current customer base in the pursuit of larger opportunities. MLS executives are already fearful of job security in the face of growing pressures to consolidate and regionalize. Elected leadership is most comfortable keeping the ship on an even keel during their short term in office, rather than charting new waters in Oceania Incognito. No one wants to take the first step or even make the first suggestion of stepping out in a new direction.
In doing the research for this project, I spoke with many MLS leaders, both vendors and CEOs. Without exception, all were intrigued by the concept but almost every one of them on both sides of the equation expressed fear of going first, of even being the first to publicly express interest.
I also spoke with a number of technology companies that might be able to provide some of the pieces of the puzzle, the elements that would be necessary to complete the entire deconstruction and reassembly of a new system. All (except for one we will discuss shortly) that I spoke with felt they could make contributions. But like the incumbents, none of them wanted to be the first one tapped as a leading contributor.
Objects and MLSs at rest remain at rest unless acted upon by an outside force. So said Sir Isaac Newton in talking about the first MLS. (Actually I think he was describing planetary orbits, but the concept is similar.)
MLS executives have plenty to worry about without some industry consultant coming along and telling them they should expand their horizons and rethink the way their technology base is structured. The market is emerging from years of gloom and doom, and as market activity increases so do the inquiries and complaints from competing practitioners. Rules violations have to be attended to. Large brokers raise new complaints about past MLS business policies. NAR considers new association charter requirements, and the execs must determine the impact on their local membership.
But some proposals need to be considered regardless of the noise competing for the executives’ attention. The future of their MLS, and of their industry, relies on constantly moving forward and innovating, changing direction when needed and never remaining at rest. The inertia of the MLS of the past must at some point be overcome, or the MLS may perish.
The Proposal – What it is and how it works
There must be a better way. My research examined the current state of MLS technology and my conclusions posit a new technology structure that could address the current chaos to the benefit of all parties involved.
Such a proposal requires a reconsideration of the MLS at its very core – its underlying platform. I believe there is an opportunity to deconstruct the MLS, to separate user interface, business logic, and data layers of the platform into separate structures. Such deconstruction would enable the MLS to become far more open and flexible. It would create a framework for technological innovation and the opportunity for more choice by end users and vendors alike.
In order to open the architecture of the MLS, we must first break apart what has up until now been a closed, locked system. We need to split the database away from the applications that are used to access it, make it a standalone storage engine with all of the business rules but only the minimal structural rules and requirements needed to support real-time retrieval and management of the data.
At the same time we must create a tool set of Application Programming Interfaces (APIs) and Software Development Kits (SDKs) to allow access by any and all developers willing to conform to the terms of a license and the established standard structure. Throughout this process we need to ensure that the database technology provider openly licenses the APIs through a common open-market system and makes them available to any and all interested parties at no charge. The primary database technology vendor must openly publish all the common source code applications need to work through the APIs. These APIs would allow development of a wide variety of new applications that can interoperate against and across any database structured in the same open manner.
The deconstructed MLS also serves to minimize many of the conflicts between the parties in the industry today. In the process it would redefine the meaning of one of the continuing criticisms of the MLS – that it levels the playing field by charging large participants for the cost of supplying services and support to smaller ones. The new level field would present equal opportunities for all with equal access to the building blocks in a “pay only for what you use” economic model.
Such a dramatic shift in the technology landscape will not occur overnight. We will need to overcome the usual objections – this is too much; too new; you’re moving too quickly; not the way we’ve always done it. But baby steps now will help us reach the ultimate goal – not just front-end options but the MLS tool box of choice.
The philosophy behind the change
We anticipate not only a change in the technology used but a change in the philosophy behind an MLS database. Currently, MLS systems create a new data record for each and every listing contract entered into the system. This creates a considerable amount of duplicate data for record fields that do not change often (beds, baths), if ever (address, lot size). Most systems also auto-populate each new listing record with data stored in a public records (tax) system and do it again each and every time a property is listed. This increases the number of duplicate fields and the quantity of redundant data.
The new system envisions a property-centric database structure with each physical property having one property record, whether or not it has ever been transacted through the MLS. When a broker signs a new listing agreement, 99% of the data needed to complete a traditional listing record will already be stored in that property record, subject to verification and update if needed.
To this property record, an agent needs to add only a few fields to “claim” the property (thank you Zillow!) and indicate that it is now actively for sale: List Date (when the listing will show as for sale), Listing Contract Date (when the listing agreement was executed), List Price (asking price), the identity of the listing agent (pre-populated from the login of the person making the entry, but subject to modification if that agent is merely helping another), and the cooperative compensation offered to other brokers. Thus each new listing event becomes an edited change to the master property record, not a new record in and of itself.
At time of activation, the agent could, optionally, add new photos, a description of the property (remarks), private showing instructions to other agents, virtual tours, or any other marketing information needed. Later, the agent would be able to update the status of a listing as it moved through the sales process.
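The single-property-record idea above can be sketched as a data structure in which a new listing agreement merely edits the master record and appends to its history. All field names and the claim workflow here are illustrative assumptions, not a published schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class PropertyRecord:
    """One master record per physical property; field names illustrative."""
    address: str
    beds: int
    baths: float
    lot_size: int                       # square feet, from public records
    listing: Optional[dict] = None      # present only while actively listed
    history: list = field(default_factory=list)

    def claim_listing(self, agent_id, list_price, list_date, contract_date, comp):
        """A new listing event edits the master record rather than
        creating a duplicate; only the handful of new fields are added."""
        self.listing = {
            "ListAgent": agent_id, "ListPrice": list_price,
            "ListDate": list_date, "ContractDate": contract_date,
            "Compensation": comp, "Status": "Active",
        }
        self.history.append(("listed", list_date, agent_id))

home = PropertyRecord("12 Elm St", beds=3, baths=2.0, lot_size=7500)
home.claim_listing("AG42", 389000, date(2015, 6, 1), date(2015, 5, 28), "2.5%")
print(home.listing["Status"])   # -> Active
```

Note that the static attributes (address, beds, lot size) are never re-entered; the claim touches only the transaction-specific fields, which is where the promised reduction in typos and duplicate data comes from.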
The history of any changes, additions, or deletions of fields or data within the property record would be religiously maintained so that no data would ever be lost. It would always be available to the MLS administration for rules enforcement purposes and to authorized governmental oversight bodies, as required by law.
This system would practically eliminate the most frequent and most troubling “tricks” that agents try in an attempt to game the system for marketing advantage. Agents would no longer be able to deactivate a listing and add it again as a new record. There is only one record, with a history that shows such activity.
Agents would not be able to create duplicate listings for the same property in multiple areas or zip codes. There is only one record in one location. (There would be provision for multiple treatments of that property, for example showing it for sale and/or for rent at the same time.) And perhaps most important, having a single property record with 90% or more of the content standardized and unchanging from listing to listing would eliminate the vast majority of typographical errors since data entry will be minimized for each listing.
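The append-only history that defeats the "withdraw and relist as new" trick might look like the sketch below; the log structure and status values are illustrative assumptions:

```python
from datetime import date

# Append-only change log keyed by a single property id; one record per
# physical property, so every event lands in the same history.
change_log = {"PROP-77": []}

def set_status(prop_id, status, when):
    """Every status change is appended, never overwritten, so gaming
    moves leave an unmistakable trail on the one master record."""
    change_log[prop_id].append({"field": "Status", "value": status, "on": when})

set_status("PROP-77", "Active",    date(2015, 3, 1))
set_status("PROP-77", "Withdrawn", date(2015, 5, 30))
set_status("PROP-77", "Active",    date(2015, 6, 1))   # same record, not "new"

relists = [e for e in change_log["PROP-77"] if e["value"] == "Active"]
print(len(relists))   # -> 2 : the relisting is visible in the history
```

A duplicate listing in another area or zip code is impossible by construction: there is no second record to create, only a second entry in this one record's log.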
Overcoming initial fear
The first reaction from the MLS vendor community to such a proposal is likely to be fear and trepidation. Opening the database to “outsiders” would undermine the very foundation of vendors who have stood the test of time immemorial – or at least since the late ‘80s when the computerized MLS network became a reality. Vendors won’t stand for that.
Or will they? What if we could show that an open system could actually expand market opportunities for vendors, increase their customer base, and make them more money by providing more of their applications than the current system, and not only in those market areas that are under their “control”?
The model being proposed here is not unlike that of the app store developed by Apple and mimicked by Google’s Android marketplace, where an open operating system is available to any and all developers, including the current major vendors. This ecosystem has proven both wildly successful for developers and widely popular among end users. Such success probably surprised even Apple, whose operating system had, since its inception, been closed and proprietary – the exact antithesis of the IBM PC, with its open OS and open arms to any and all programs written for it. Thus did Apple mature and build on the success of the iPhone® with the introduction of the iPad® and soon other iGadgets.
So we urge the vendor community to consider this approach with an open mind and contemplate the endless possibilities.
In the course of my initial research I talked to a number of MLS vendors. I wanted to get their reaction to such a change in technology – was this a valid pursuit? Would it improve the industry and the participants equally, as I had imagined it would? Would they be interested in being considered as the principal technology vendor for such a project?
Reactions to these questions were wide and varied, ranging from mildly amused to wildly supportive. The smaller vendors who were struggling to compete were very interested in learning more. Medium to larger vendors were interested, but not too concerned that this might present a threat to their continuing business. And the one system vendor with great longevity and respect in the industry pooh-poohed the idea, saying it would never work.
Having now proposed a radical transformation of the MLS infrastructure, our only remaining task is to find someone to build it. Now that we have deconstructed the MLS and proposed a path toward reconstruction and resurrection, we will complete our journey and identify a solution provider in Part IV.
This post first appeared on Procuring Cause blog.