Analysis and Comparison of Diabetes Prediction Using a Fine KNN Classifier and a Medium KNN Classifier
Abstract
Aim: The goal of the study is to detect the presence of diabetes using the Fine K-NN (K-Nearest Neighbour) and Medium K-NN (K-Nearest Neighbour) algorithms and to compare their accuracy, specificity, and sensitivity. Materials and Methods: A dataset from Kaggle's website was used in this research. The samples were taken as N=25 for Fine KNN and N=25 for Medium KNN; following clinicalc.com, the total sample size was calculated with an alpha error threshold of 0.05, an enrollment ratio of 0:1, a 95% confidence level, and 80% power. Accuracy, specificity, and sensitivity were calculated using Matlab programming software. Results: Accuracy (%), specificity (%), and sensitivity (%) were compared in SPSS using independent-sample t-tests. Fine KNN achieved an accuracy of 60.4%, specificity of 62.18%, and sensitivity of 59.18%, compared with Medium KNN's accuracy of 47.2%, specificity of 48.3%, and sensitivity of 34.5%. The difference was statistically insignificant for accuracy (P=0.832, P>0.05) but significant for specificity (P=0.002, P<0.05) and sensitivity (P<0.001, P<0.05). Conclusion: Fine KNN appears to give better accuracy, specificity, and sensitivity than Medium KNN for predicting diabetes.
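The study's comparison was run in Matlab, where the Classification Learner app defines Fine KNN as a model with 1 neighbour and Medium KNN as one with 10 neighbours. As an illustrative sketch only (not the study's code), the same Fine-vs-Medium comparison and the three reported metrics can be reproduced in Python with scikit-learn; the synthetic data here is a stand-in for the Kaggle diabetes dataset, and all names are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Synthetic binary-classification data standing in for the Kaggle
# diabetes dataset (assumed: 8 features, diabetic/non-diabetic label).
X, y = make_classification(n_samples=768, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def evaluate(k):
    """Fit a k-NN classifier and return accuracy, specificity, sensitivity."""
    model = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    tn, fp, fn, tp = confusion_matrix(y_test, model.predict(X_test)).ravel()
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "specificity": tn / (tn + fp),  # true-negative rate
        "sensitivity": tp / (tp + fn),  # true-positive rate (recall)
    }

# Matlab's presets: Fine KNN uses k=1, Medium KNN uses k=10.
fine = evaluate(1)
medium = evaluate(10)
```

The per-run metric dictionaries correspond to the three quantities the abstract compares; in the actual study these were computed over repeated samples (N=25 per classifier) and then compared with independent-sample t-tests in SPSS.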