
Chinese Journal of Applied Ecology

• Research Report •


Eco-value level classification model of forest ecosystem based on modified projection pursuit technique

WU Chengzhen; HONG Wei; HONG Tao   

  1. Institute of Forest Ecology, College of Forestry, Fujian Agriculture and Forestry University, Fuzhou 350002, China

  • Received: 2005-01-31  Revised: 2005-04-25  Online: 2006-03-18  Published: 2006-03-18


Abstract: The projection pursuit technique suffers from a complex computation process, long search times, difficult computer programming, and projection directions that are hard to optimize. To address these shortcomings, this paper applies a modified simplex method (MSM) to directly optimize the projection function and projection direction, simplifying the implementation of projection pursuit. On this basis, an eco-value level classification model (EVLCM) of forest ecosystems is proposed, which integrates the multidimensional classification indices of each forest ecosystem into a one-dimensional projection value; a higher projection value denotes a higher ecosystem service value, so a sample set of forest ecosystems can be reasonably classified by ranking projection values. A worked example shows that the EVLCM, driven directly by sample data, is simple, feasible, widely applicable, and easy to operate: its optimization time and projection function value were 34% and 143%, respectively, of those obtained with the traditional projection pursuit technique. The model can be applied extensively to classify and evaluate all kinds of non-linear, high-dimensional data in ecology, biology, and regional sustainable development research.
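The classification procedure described in the abstract can be sketched as follows. This is a minimal illustration of generic projection pursuit classification, assuming the widely used "spread × local density" (S·D) projection index and standard min-max normalization; the authors' exact projection index, data, and modified simplex variant are not reproduced here, and SciPy's Nelder-Mead simplex optimizer stands in for their modified simplex method.

```python
# Hypothetical sketch of projection-pursuit-based eco-value classification.
# Assumptions (not from the paper): the S*D projection index, a density
# window R = 0.1*S_z, and scipy's Nelder-Mead as the simplex optimizer.
import numpy as np
from scipy.optimize import minimize

def projection_index(a, X):
    """Projection index Q(a) = S_z * D_z for direction a over data X."""
    a = a / np.linalg.norm(a)            # keep the direction a unit vector
    z = X @ a                            # one-dimensional projection values
    Sz = z.std()                         # spread (standard deviation) of z
    R = 0.1 * Sz                         # local density window (common choice)
    d = np.abs(z[:, None] - z[None, :])  # pairwise distances of projections
    Dz = np.sum((R - d) * (d < R))       # local density term
    return Sz * Dz

def best_direction(X, seed=0):
    """Maximize Q(a) over the projection direction with a simplex search."""
    rng = np.random.default_rng(seed)
    a0 = rng.random(X.shape[1])
    res = minimize(lambda a: -projection_index(a, X), a0,
                   method="Nelder-Mead")
    return res.x / np.linalg.norm(res.x)

# Usage: normalize each indicator to [0, 1], optimize the direction,
# then rank samples -- a larger projection value means a higher
# eco-value level.
X = np.random.default_rng(1).random((20, 4))      # 20 samples, 4 indicators
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
a = best_direction(Xn)
z = Xn @ a
ranking = np.argsort(-z)                          # best eco-value level first
```

Once the samples are ordered by projection value, level boundaries can be set by inspecting natural gaps in the sorted `z` values, which is how a one-dimensional projection supports multi-level classification.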