Making Data Work for Your Geospatial Challenges

Blog Administrator | Data Enhancement, Data Enrichment, Geocoding, hazard data, Hazard Hub, IP Location, Location Intelligence, Marketing, Property Data

Anyone who has looked at flood data knows that FEMA is not the answer. It’s not that FEMA doesn’t try to accurately map what it believes to be floodable areas; it’s that FEMA is politically driven. A city does not have to participate in FEMA mapping. Why would it opt out? Properties in flood zones tend to have lower valuations, and lower valuations tend to generate lower taxes. Even as an individual you can exempt yourself with a Letter of Map Amendment (LOMA), which establishes a property’s location in relation to the Special Flood Hazard Area (SFHA). LOMAs are usually issued because a property has been inadvertently mapped as being in the floodplain when it is actually on natural high ground above the base flood elevation.

A Melissa customer wanted to sell flood insurance to two kinds of prospects: those in flood zones but unlikely to flood, and those outside flood zones but likely to flood. They had three primary targets: properties in 100-year flood zones that were not likely to flood, properties in 500-year flood zones that were not likely to flood, and properties in FEMA minimal-risk zones that were likely to flood. The customer realized they needed a way to see the current FEMA designation for each target property alongside an independent evaluation of its flood likelihood. For them, we created a sample set of customers utilizing HazardHub risk data that looked something like this:


Then they created a targeted list by selecting B and C score prospects from the 100-year and 500-year floodplains and D and F score prospects from the minimal-risk flood zones. This scoring and these selections are available nationwide and gave the customer the ability to selectively target the types of customers they were interested in from a risk-exposure perspective. While this example discusses flood, the approach works for any natural hazard to which properties, both personal and commercial, are exposed.

If you have data challenges to solve, perhaps the Melissa team can offer the location intelligence solution you need. Melissa supports geospatial professionals in the goal of mapping innovation in location-based services, analytics, and decision-making powered by location intelligence. We provide a wealth of location data enrichments, including global geocoding to derive latitude and longitude from an address and geo-enriched data for IP addresses. We also offer other types of specialized data that can be linked to location data to reveal relationships and trends: U.S. property and mortgage data (type and number of buildings on a parcel, property age, construction, sales value, and more), demographics (household income, marital status, residence data, credit information, and more), and risk and natural hazard information (wind, water, ground, and wildfire). Our data feeds easily into popular data visualization and analytics platforms for ease of use and up-to-the-minute accuracy. When you need to solve global challenges with geospatial technology, turn to Melissa, your single, trusted source for authoritative reference data.
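The selection logic described here can be sketched in a few lines of Python. This is a minimal illustration under assumed names: the record layout (`fema_zone`, `flood_grade`) and the A-F grade values are hypothetical stand-ins, not the actual HazardHub schema.

```python
# Illustrative sketch: select flood-insurance prospects where the FEMA zone
# and an independent flood-risk grade (A-F, A = lowest risk) disagree.
# Field names and values are invented for this example.

def select_prospects(properties):
    """Return properties whose FEMA designation and independent grade diverge."""
    targets = []
    for p in properties:
        zone, grade = p["fema_zone"], p["flood_grade"]
        # Mapped as a 100-year or 500-year zone, but graded unlikely to flood...
        if zone in ("100-year", "500-year") and grade in ("B", "C"):
            targets.append(p)
        # ...or mapped minimal risk, but graded likely to flood.
        elif zone == "minimal" and grade in ("D", "F"):
            targets.append(p)
    return targets

sample = [
    {"address": "1 Main St", "fema_zone": "100-year", "flood_grade": "B"},
    {"address": "2 Oak Ave", "fema_zone": "minimal",  "flood_grade": "F"},
    {"address": "3 Elm Rd",  "fema_zone": "500-year", "flood_grade": "A"},
]
print([p["address"] for p in select_prospects(sample)])
# → ['1 Main St', '2 Oak Ave']
```

In practice the same filter would run over a nationwide property file rather than a three-row sample.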

Written for Directions Magazine – Header Image via @directionsmag

The Seven Things Most Insurers Don’t Know About Hazard Data

Blog Administrator | hazard data

Unless you spend every day thinking about, creating, and updating hazard data, there’s probably a lot you don’t know about it. We were asked for the top seven things insurers don’t know about hazard data. Here’s our list:

1) There’s a cost to not using the data. Most hazard data is sourced on a per-transaction basis or on an annual license based on anticipated transaction volume. In other words, data costs money. But that cost is minimal compared to taking on unnecessary risk or underpricing a policy. Very few underwriting departments examine the cost of making a bad decision to save a dollar or two on hazard data. The thought process is almost always “What’s this report going to cost?” rather than “What’s the cost of writing a bad policy?”

2) Hazard data changes. The hazard data you used last year to quote and bind a policy may not be the same this year. Hazard databases are evolutionary, not static. Some change annually, some quarterly, and some even monthly. Events that happen, or don’t happen, this month, quarter, or year will affect the update of the hazard data and can change the scores of your models. For example, a wildfire can take an area from high risk to low risk literally overnight; it may take three to five years or longer for the vegetation to grow back enough to make it a high-scoring area again.

3) The more hazard data the better. Typically companies only look at the hazard variables they are interested in. That makes perfect sense, but it’s also a little short-sighted. By acquiring hazard data beyond its basic needs, an insurance company can start building up a history of scores associated with a property, find out whether other variables are predictive for it, and better understand the likelihood of a claim being filed.
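One simple way to start testing whether an extra hazard variable is predictive is to compare claim frequency across its grades. The sketch below is illustrative only; the `wind_grade` field and the sample records are invented, and a real analysis would use a proper statistical test over a full book of business.

```python
# Illustrative sketch: compare claim frequency by hazard grade to see
# whether a newly acquired variable looks predictive. Data is invented.
from collections import defaultdict

history = [
    {"wind_grade": "A", "had_claim": False},
    {"wind_grade": "A", "had_claim": False},
    {"wind_grade": "D", "had_claim": True},
    {"wind_grade": "D", "had_claim": False},
    {"wind_grade": "F", "had_claim": True},
]

claims = defaultdict(lambda: [0, 0])  # grade -> [claims filed, policies]
for record in history:
    stats = claims[record["wind_grade"]]
    stats[0] += record["had_claim"]  # True counts as 1
    stats[1] += 1

for grade in sorted(claims):
    filed, total = claims[grade]
    print(f"grade {grade}: claim frequency {filed / total:.0%}")
```

If the frequencies climb steadily from A to F, the variable is worth a closer look in the pricing model.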

4) Hazard data should be used to educate the consumer. You may recall the Farmers commercials that showed how to prevent claims. They were humorously teaching consumers how to mitigate risk. Underwriting and Claims should be leading the charge on mitigation. Hazard data reports provided to the consumer can go a long way in educating the policyholder and reducing claims. As odd as this may seem, very few consumers know their true hazard exposure.

5) Exposure analysis and underwriting should work hand in hand. Some people look at the big picture, some people look at the small picture.  Macro versus micro.  But the underlying hazard data to look at both pictures should be the same. When you have differences in data, or versions of data, different departments can be sending different messages throughout the company.

6) Using hazard data does not have to be hard. Through the use of LandVision Insurance and APIs, as well as geospatial files, data delivery can be tailored to meet your specific needs. Companies can also vary their hazard data delivery by department, with some departments receiving geospatial files, some reports, and some the API. The easier it is to use hazard data, the more likely it is to get used.

7) The more you use hazard data, the better your loss ratios become. Knowledge is power, according to Sir Francis Bacon. And when it comes to accurately quoting, writing, and binding a policy, the more you know, the better. Greater knowledge at the beginning of the policy process means better policies written, which correlates with better loss ratios.

If you came up with some other answers, please send them our way. Clearly there are more than seven things that aren’t typically known about hazard data, but we stopped at seven. Without getting into all of the technical details of sourcing, update schedules, modeling techniques, and so on, the key point is this: if you have access to hazard data, use it. If you don’t have access, get it. The more data you have, the better your decisions will be.

Melissa and Runner EDQ Expand Partnership to Deliver Data Quality to Oracle Users

Blog Administrator | Uncategorized

Oracle Validated Integration Offers Oracle Customers Rapid Access to Clean, Accurate Address and Contact Data

Rancho Santa Margarita, CALIF. – November 14, 2017 – Melissa, a leading provider of global contact data quality and identity verification solutions, today expanded its partnership with Runner Technologies (Runner EDQ), a leading provider of integrated address verification solutions and Platinum level member of the Oracle Partner Network (OPN). Together the two firms deliver CLEAN_Address®, an Oracle-validated, integrated address verification solution for the PeopleSoft Enterprise, JD Edwards EnterpriseOne, and E-Business Suite platforms.

To achieve Oracle Validated Integration, Oracle partners are required to meet a stringent set of requirements that are based on the needs and priorities of Oracle customers. With this integration, CLEAN_Address® can meet the needs of users to capture and maintain accurate address and contact data in a broad spectrum of markets, including healthcare, finance, travel and transportation, government, and higher education. Joint customers include Simon & Schuster, McKesson Corp., and the U.S. Small Business Administration.

“Today’s data-driven enterprise demands enterprise-grade data quality infrastructure – this is essential for successful master data management, data integration, and data aggregation,” said Bud Walker, vice president, enterprise sales and strategy, Melissa. “The collaboration between Melissa and Runner EDQ meets this need with the integration of postal address verification, email verification, and other data enhancements, ideal for systems such as human capital management, student information, financial, vendor management systems, and CRM platforms.”

“Achieving Oracle Validated Integration gives all our customers the confidence that integration between Clean_Address and Oracle is functionally sound and performs as tested,” said Greg Marinello, CEO, Runner EDQ. “By combining efforts and capitalizing on Melissa’s global toolset and authoritative reference datasets, we’re empowering Oracle users to directly engage with data quality, enabling them to cleanse, update, and enrich their existing and incoming data throughout the data lifecycle.”

CLEAN_Address also includes an API for custom development, allowing users to create business rules that enforce individual data quality standards. Click here for detailed information on CLEAN_Address or to access a free trial. To connect with members of Melissa’s global intelligence team, visit or call 1-800-MELISSA.

HazardHub Predictions of Hurricane Harvey Show Significant Risk Compared to Competitors

Blog Administrator | hazard data, Hazard Hub, Property Data

Exact numbers for the devastation caused by Hurricane Harvey won’t be in for quite some time, but the storm surge damage estimates are nothing short of extraordinary. Melissa’s partner, HazardHub, the only third-generation provider of property-level hazard risk databases spanning the most dangerous perils in the continental United States, released a shocking calculation before the 2017 storm ever reached Texas.

HazardHub predicted that risk from the storm, depending on the hurricane’s Category ranking, could be double that of its competitors’ predictions. HazardHub warned that 367,000 homes could be affected if Harvey became a Category 4, and that $77 billion of property was at risk. A report from The Balance, after Hurricane Harvey hit, stated that total repairs will cost $180 billion and that 203,000 homes were damaged, thus far. Competitive estimates indicated only $30 billion in property damage and between $48 and $75 billion in total losses.

Although its estimate of the number of homes at risk was higher than the actual count, HazardHub erred on the side of caution in the spirit of being best prepared for storm surge risk, as the company believes that everyone who owns a home or business should know the risks around their property and prepare as best they can.

Bob Frady, CEO of HazardHub, described it best: “For too many years, people have been caught by surprise by storm surges that caused massive damage to homes and businesses. We want to help eliminate that surprise by providing cutting-edge tools that help to estimate risk from – and allow people to prepare more fully for – storm surge.”

HazardHub converts very large, very complicated data sets of long-run hazard event data into risk grades for natural disasters such as flood, wildfire, lightning, hurricanes, tornadoes, earthquakes and more, for every home in the contiguous U.S. You can check an address or latitude/longitude point, one at a time or in batch, to get detailed hazard reports on an entire mailing list, customer file, area or region.

Use HazardHub To:

  • See detailed reports of natural hazard risks in any area
  • Reach out to potential customers who could use targeted insurance services
  • Pinpoint exact hazards at the address level for granular risk assessment
  • Give the most personalized insurance quotes by knowing which services your customers need
  • Determine which customers can best use services for mitigation, new roofs or storm preparedness
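The batch workflow mentioned above, checking a whole mailing list one chunk at a time, can be sketched as follows. Note that the `lookup_hazards` function here is a hypothetical stub, not the real HazardHub API; actual endpoints, authentication, and response shapes are not shown.

```python
# Illustrative sketch: chunk a mailing list and collect a hazard report per
# address. lookup_hazards is a stand-in stub for a real batch request.

def chunked(items, size):
    """Yield successive fixed-size chunks of a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def lookup_hazards(addresses):
    # Hypothetical stub: a real call would query the hazard service and
    # return per-address risk grades.
    return {addr: {"flood": "B", "wildfire": "A"} for addr in addresses}

mailing_list = [f"{n} Main St" for n in range(1, 8)]
reports = {}
for batch in chunked(mailing_list, 3):
    reports.update(lookup_hazards(batch))

print(len(reports))  # one report per address on the list
```

Chunking keeps individual requests small while still covering an entire customer file or region in one pass.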

Comprehensive, accurate analysis of the natural hazard risks to homes and businesses helps people better prepare for natural disasters, understand which services they do or don’t need, and make better real-world decisions.

HazardHub Web Service at