PHP Script Optimization - MySQL Search and Sort
I have developed a simple script that searches a database and sorts the results based on the search terms used, trying to return the most relevant entries first.
It ran fine on my local machine, and before I put the sorting in it also ran okay on the web server I have hired; once the sorting went in, search times increased noticeably on the web server.
What I'm posting below I have optimized as best I know how. I'm looking for a better sort algorithm, and maybe a better way of querying the database, to speed up the sort times!
The data I'm working with needs to allow searches of 3 letters or more, for example "cat" or "car". I couldn't change the natural-language search minimum word length on the MySQL server, so I can't use MySQL's natural language searching, which is why I'm doing the queries the way I have.
Also, an average search can return anywhere between 100 and 15,000 results, with the databases holding around 20,000 entries.
Any help is appreciated.
<?php
require_once 'config.php';

$bringtone = true;
$asearchstrings = $_POST["searchstrings"];

$cconnection = new mysqli($dbhost, $dbuser, $dbpass, $dbname);
if (mysqli_connect_errno()) {
    exit();
}

// Build one LIKE-based query per table, matching each search string either
// at the start of the comma-separated keywords field or after a comma.
$stables = array("natural", "artificial", "musical", "created");
$aqueries = array();

foreach ($stables as $stable) {
    $squery = "SELECT filename, downloadpath, description, imagefilepath, keywords FROM `$stable` WHERE";
    $sparamtypes = "";
    $aparams = array();
    $icount = 0;

    foreach ($asearchstrings as $sstring) {
        $sparamtypes .= "ss";
        $aparams[] = "%,$sstring%";
        $aparams[] = "$sstring%";
        $squery .= $icount++ == 0
            ? " (keywords LIKE ? OR keywords LIKE ?)"
            : " AND (keywords LIKE ? OR keywords LIKE ?)";
    }

    array_unshift($aparams, $sparamtypes);
    $aqueries[$squery] = $aparams;
}

$aresults = array();

foreach ($aqueries as $squery => $aparams) {
    if ($cstmt = $cconnection->prepare($squery)) {
        $aqueryresults = array();
        call_user_func_array(array($cstmt, 'bind_param'), $aparams);
        $cstmt->execute();
        $cstmt->bind_result($sfilename, $sdownloadpath, $sdescription, $simagefilepath, $skeywords);

        while ($cstmt->fetch()) {
            if ($bringtone) {
                $sfilename = $_SERVER['DOCUMENT_ROOT'] . "/m4r/" . str_replace(".wav", ".m4r", $sfilename);
                if (file_exists($sfilename)) {
                    $sdownloadpath = str_replace("sounds", "m4r", str_replace(".wav", ".m4r", $sdownloadpath));
                    $aresults[$sdownloadpath] = array($sdownloadpath, $sdescription, $simagefilepath, $skeywords, $asearchstrings);
                }
            }
        }

        $aresults = array_merge($aresults, $aqueryresults);
        $cstmt->close();
    }
}

$cconnection->close();
$aresults = array_values($aresults);

function in_arrayi($needle, $haystack)
{
    return in_array(strtolower($needle), array_map('strtolower', $haystack));
}

// Order results so that entries whose keyword list contains an earlier
// search string come before those that don't.
function keywordsort($a, $b)
{
    if ($a[0] === $b[0]) return 0;

    $akeywords = explode(",", $a[3]);
    $bkeywords = explode(",", $b[3]);

    foreach ($a[4] as $ssearchstring) {
        $afound = in_arrayi($ssearchstring, $akeywords);
        $bfound = in_arrayi($ssearchstring, $bkeywords);
        if ($afound && !$bfound) {
            return -1;
        } else if ($bfound && !$afound) {
            return 1;
        }
    }

    return 0;
}

usort($aresults, "keywordsort");

// Strip the keyword list and search strings before returning the results.
foreach ($aresults as &$aresult) {
    unset($aresult[3]);
    unset($aresult[4]);
}

echo json_encode($aresults);
?>
Sorting large quantities of data while having to split a field code-side is slow. Rather than optimizing that, I'd recommend a different way of doing it, such as full-text indexing. It's quite neat once it's working.
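For illustration only, a minimal full-text sketch using the table and column names from the question. It assumes you can add a FULLTEXT index, and that the server's minimum indexed word length (ft_min_word_len for MyISAM, innodb_ft_min_token_size for InnoDB) allows 3-letter terms, which the question says cannot be changed on this host:

ALTER TABLE `natural` ADD FULLTEXT INDEX idx_keywords (keywords);

SELECT filename, downloadpath, description, imagefilepath, keywords,
       MATCH (keywords) AGAINST ('cat car' IN NATURAL LANGUAGE MODE) AS relevance
FROM `natural`
WHERE MATCH (keywords) AGAINST ('cat car' IN NATURAL LANGUAGE MODE)
ORDER BY relevance DESC;

MySQL computes the relevance score for you, so the PHP-side usort over every row would no longer be needed.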
If full-text isn't an option, I'd recommend splitting the keywords off into a separate table. That way, you can sort based on the match count after grouping. For example ...
SELECT d.*, COUNT(k.id) AS keywordcount
FROM data d
INNER JOIN keywords k ON (d.id = k.dataid)
WHERE k.value IN ('keyword1', 'keyword2', 'keyword3')
GROUP BY d.id
ORDER BY keywordcount
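For reference, a rough sketch of the keywords table that example assumes, with one row per (entry, keyword) pair instead of a comma-separated column; the table and column names (data, keywords, dataid, value) are just placeholders matching the query above:

CREATE TABLE keywords (
    id     INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    dataid INT UNSIGNED NOT NULL,   -- references data.id
    value  VARCHAR(64)  NOT NULL,
    KEY idx_dataid (dataid),
    KEY idx_value (value)
);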
On a PS-ish type note, you can also speed things up by UNIONing the SELECTs and then ordering, rather than running them independently.
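Something like the sketch below, assuming the four tables from the question all share the same column layout. It is shown with a literal search term for brevity; in the actual script you would keep the ? placeholders and bound parameters:

SELECT filename, downloadpath, description, imagefilepath, keywords FROM `natural`    WHERE keywords LIKE '%,cat%' OR keywords LIKE 'cat%'
UNION ALL
SELECT filename, downloadpath, description, imagefilepath, keywords FROM `artificial` WHERE keywords LIKE '%,cat%' OR keywords LIKE 'cat%'
UNION ALL
SELECT filename, downloadpath, description, imagefilepath, keywords FROM `musical`    WHERE keywords LIKE '%,cat%' OR keywords LIKE 'cat%'
UNION ALL
SELECT filename, downloadpath, description, imagefilepath, keywords FROM `created`    WHERE keywords LIKE '%,cat%' OR keywords LIKE 'cat%';

This turns four separate prepared statements and round trips into one, and any ORDER BY can then be applied once to the combined result set.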