Copyright © 2017 ISEIS. All rights reserved
Geospatial Information Diffusion Based on Self-Learning Discrete Regression
When studying a phenomenon on the Earth's surface, such as a natural disaster, water pollution, or land use, data may be missing or insufficient in some geographic units. Most interpolation models cannot estimate such missing data because they rely on continuity assumptions, whereas most geospatial data are not continuous. In this article, we develop an information diffusion technique, called self-learning discrete regression (SLDR), to infer the missing data of these gap units. To show how to use the suggested model, a virtual case based on flood experience in China is studied, in which the flood losses of the gap units are inferred from background data: population, per-capita GDP, and the relative exposure of each unit to flood. For this case, a comparison shows that SLDR is clearly superior to geographically weighted regression (GWR) and the back-propagation neural network (BP network), reducing the error by about 60% and 33%, respectively. To substantiate these case-specific findings, ten simulation experiments are run with purely random seeds. The averaged results show that the validity of GWR for filling gap units is doubtful, and that SLDR is more accurate than the BP network.
Keywords: information diffusion, geographic unit, regression, flood loss, simulation experiment