(PECL sphinx >= 0.1.0)
SphinxClient::setLimits — Set offset and limit of the result set
public bool SphinxClient::setLimits ( int $offset , int $limit [, int $max_matches = 0 [, int $cutoff = 0 ]] )
Sets an offset into the server-side result set (offset) and the number of matches to return to the client starting from that offset (limit). It can also control the server-side result set size for the current query (max_matches) and specify a threshold (cutoff): once that many matches have been found, searching stops.
offset
Result set offset.
limit
Number of matches to return.
max_matches
Controls how many matches searchd keeps in RAM while searching.
cutoff
Used for advanced performance tuning. It tells searchd to forcibly stop the search once cutoff matches have been found and processed.
Returns TRUE on success, or FALSE on failure.
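A minimal usage sketch (not from the manual), assuming a local searchd listening on 127.0.0.1:9312 and an index named "test1"; both names are placeholders:

<?php
$cl = new SphinxClient();
$cl->setServer("127.0.0.1", 9312);

// Return 20 matches starting at offset 40 (i.e. the third page of 20),
// keep at most 1000 matches in searchd's memory for this query,
// and stop searching once 5000 matches have been found and processed.
// Note that a non-zero cutoff also limits the reported total_found.
$cl->setLimits(40, 20, 1000, 5000);

$result = $cl->query("some keywords", "test1");
if ($result === false) {
    echo "Query failed: " . $cl->getLastError() . "\n";
} else {
    $returned = isset($result['matches']) ? count($result['matches']) : 0;
    printf("Matches returned: %d of %d total found\n",
        $returned, $result['total_found']);
}
?>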
xxxbunker dot com (2010-03-21 22:52:03)
The max_matches / cutoff parameters are priceless.
If you ever have a situation where you need a count of the number of matches but only need to display, let's say, the top 10, these two parameters are very handy; see the sketch below.
We used to get the occasional 'unable to connect' error with Sphinx; after implementing these two parameters where applicable, those errors disappeared, load dropped, and the servers were much happier.
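A hedged sketch of that pattern (the index name and query string are made up): only the top 10 matches are shipped back while the overall count is still read from total_found. cutoff is left at its default here, since a non-zero cutoff would also truncate the reported count.

<?php
$cl = new SphinxClient();
$cl->setServer("127.0.0.1", 9312);

// Only the top 10 matches are returned and kept in memory server-side.
$cl->setLimits(0, 10, 10);

$result = $cl->query("red shoes", "products");
if ($result !== false) {
    $shown = isset($result['matches']) ? count($result['matches']) : 0;
    // total_found still reports how many documents matched in total.
    printf("Showing %d of %d results\n", $shown, $result['total_found']);
}
?>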
shoanm at users dot sourceforge dot net (2009-08-25 03:06:12)
I almost pulled out all my hair trying to figure this one out. After applying limits using
$s->setLimits(10, 10);
the search kept returning only false, and getLastError() and getLastWarning() contained empty strings.
The solution, like Nayana stated, is to pass a positive non-zero integer $max to setLimits.
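For illustration, the fixed call would look something like the following, where 1000 is just an arbitrary positive value matching the default per-server max_matches:

<?php
$s = new SphinxClient();
$s->setServer("127.0.0.1", 9312);
// A positive third argument ($max) avoids the silent false return
// described in this note and the one below.
$s->setLimits(10, 10, 1000);
?>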
Nayana Hettiarachch nayana at corp-gems dot com (2009-01-23 08:36:08)
If you get the error
per-query max_matches=0 out of bounds (per-server max_matches=1000)
make sure that you also set $max to a value other than the default 0.
There is an issue filed with a patch if you feel like patching;
the first option works well as a workaround.
http://sphinxsearch.com/bugs/view.php?id=208