Department of Computer Science
Permanent URI for this collection: http://197.255.125.131:4000/handle/123456789/23129
Item A cascading approach using SE-ResNeXt, ResNet and feature pyramid network for kidney tumor segmentation (Heliyon, 2024) Appati, J.K.; Yirenkyi, I.A.
Accurate segmentation of kidney tumors in CT images is very important in the diagnosis of kidney cancer. Automatic semantic segmentation of kidney tumors has shown promising results towards developing advanced surgical planning techniques for the treatment of kidney tumors. However, the relatively small size of the kidney tumor volume in comparison to the overall kidney volume, together with its irregular distribution and shape, makes it difficult to segment the tumors accurately. In addressing this issue, we proposed a coarse-to-fine segmentation which leverages transfer learning, using an SE-ResNeXt model for the initial segmentation and a ResNet with Feature Pyramid Network for the final segmentation. The processes are related, and the output of the initial stage was used to train the final stage. We trained and evaluated our method on the KiTS19 dataset and achieved a Dice score of 0.7388 and a Jaccard score of 0.7321 for the final segmentation, demonstrating promising results when compared to other approaches.

Item An advance ensemble classification for object recognition (Neural Computing and Applications, 2021) Owusu, E.; Wiafe, I.
The quest to improve performance accuracy and prediction speed in machine learning algorithms cannot be overemphasized, as the need for machines to outperform humans continues to grow. Accordingly, several studies have proposed methods to improve prediction performance and speed, particularly for spatio-temporal analysis. This study proposes a novel classifier that leverages ensemble techniques to improve prediction performance and speed. The proposed classifier, Ada-AdaSVM, uses an AdaBoost feature selection algorithm to select a small subset of features from the input datasets for a joint support vector machine (SVM)–AdaBoost classifier. The proposition is evaluated against a selection of existing classifiers (SVM, AdaSVM and AdaBoost) using the JAFFE, Yale, Taiwanese Facial Expression Image Database (TFEID) and CK+48 datasets, with Haar features as the preferred method for feature extraction. The findings indicated that Ada-AdaSVM outperforms the SVM, AdaSVM and AdaBoost classifiers in terms of speed and accuracy.
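The Ada-AdaSVM entry above pairs AdaBoost-based feature selection with an SVM stage. Below is a minimal sketch of that idea in scikit-learn; the digits dataset, the importance cutoff, and the plain RBF SVM are illustrative assumptions, not the authors' exact pipeline.

```python
# Hedged sketch: AdaBoost ranks features, an SVM classifies on the kept subset.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)              # stand-in for Haar features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: boost decision stumps and keep only features AdaBoost actually used.
booster = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
selected = np.flatnonzero(booster.feature_importances_ > 0)

# Stage 2: train the SVM on the reduced feature set.
svm = SVC(kernel="rbf", gamma="scale").fit(X_tr[:, selected], y_tr)
print(f"kept {len(selected)}/{X.shape[1]} features, "
      f"test accuracy = {svm.score(X_te[:, selected], y_te):.3f}")
```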
Item Analysis and Implementation of Optimization Techniques for Facial Recognition (Hindawi, 2021) Appati, J.K.; Abu, H.; Owusu, E.; Darkwah, K.
Amidst the wide spectrum of recognition methods proposed, these algorithms still fail to yield optimal accuracy under varying illumination, pose, and facial expression. In recent years, considerable attention has been on the use of swarm intelligence methods to help resolve some of these persistent issues. In this study, the principal component analysis (PCA) method, with its inherent property of dimensionality reduction, was adopted for feature selection. The resultant features were optimized using the particle swarm optimization (PSO) algorithm. For the purpose of performance comparison, the resultant features were also optimized with the genetic algorithm (GA) and the artificial bee colony (ABC). The optimized features were used for recognition with Euclidean distance (EUD), K-nearest neighbor (KNN), and support vector machine (SVM) classifiers. Experimental results of these hybrid models on the ORL dataset reveal an accuracy of 99.25% for PSO and KNN, followed by ABC with 93.72% and GA with 87.50%. In contrast, experimentation with PSO, GA, and ABC on the YaleB dataset results in 100% accuracy, demonstrating their efficiency over state-of-the-art methods.
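The facial-recognition entry above optimizes PCA features with PSO before KNN classification. The sketch below illustrates one common way to realize this: a global-best PSO over continuous positions, where a PCA component is kept when its position exceeds 0.5. The swarm size, inertia and acceleration constants, the 0.5 threshold, and the digits dataset are assumptions for demonstration, not the paper's settings.

```python
# Illustrative PCA + PSO + KNN hybrid (not the paper's exact configuration).
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)
Z = PCA(n_components=40).fit_transform(X)        # PCA features to be optimized

def fitness(mask):
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(knn, Z[:, mask], y, cv=3).mean()

n_particles, dim, iters = 12, Z.shape[1], 15
pos = rng.random((n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Standard global-best velocity update with assumed coefficients.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    fit = np.array([fitness(p > 0.5) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best CV accuracy:", round(pbest_fit.max(), 3),
      "features kept:", int((gbest > 0.5).sum()))
```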
Item Assessing the impact of persuasive features on user's intention to continuous use: the case of academic social networking sites (Behaviour & Information Technology, 2020) Wiafe, I.; Kastriku, F.A.; Koranteng, F.N.; Gyamera, G.O.
Social networking sites enable people to connect, communicate and share ideas. These sites have therefore become key for information sharing. Particularly, academics and researchers have adopted them for networking and collaboration. This study investigates how embedded persuasive features on social networking sites designed for academics and researchers affect continuous use intention. The study adopted an existing model for assessing the effectiveness of persuasive features on systems, sampled 416 participants engaged in academic research, and analyzed their responses. The results indicate that Social Support, Computer-Human Dialogue Support and Primary Task Support significantly impact how users perceive social networking sites designed for effective academic work. Contrary to existing knowledge that Perceived Credibility, Perceived Effectiveness, Perceived Effort and Perceived Social Support all impact an individual's intention to continuously use a system, only Perceived Credibility was observed to impact Intention to Use continuously. The findings also proved that affective ties and mutual support on academic social networking sites influence behaviour.

Item Assessing the impact of persuasive features on user's intention to continuous use: the case of academic social networking sites (Taylor & Francis Group, 2022) Wiafe, I.; Koranteng, F.N.; Kastriku, F.A.; Gyamera, G.O.
Social networking sites enable people to connect, communicate and share ideas. These sites have therefore become key for information sharing. Particularly, academics and researchers have adopted them for networking and collaboration. This study investigates how embedded persuasive features on social networking sites designed for academics and researchers affect continuous use intention. The study adopted an existing model for assessing the effectiveness of persuasive features on systems, sampled 416 participants engaged in academic research, and analyzed their responses. The results indicate that Social Support, Computer-Human Dialogue Support and Primary Task Support significantly impact how users perceive social networking sites designed for effective academic work. Contrary to existing knowledge that Perceived Credibility, Perceived Effectiveness, Perceived Effort and Perceived Social Support all impact an individual's intention to continuously use a system, only Perceived Credibility was observed to impact Intention to Use continuously. The findings also proved that affective ties and mutual support on academic social networking sites influence behaviour.

Item Automatic flotation froth bubble size distribution estimation using mean shift and watershed transforms (2015-06-23) Amankwah, A.
Flotation is a process used in the mineral industry to concentrate valuable minerals. The efficiency of the process is influenced by, among other factors, the bubble size distribution of the flotation froth. In this investigation, we propose a new method for automatic estimation of the bubble size distribution using the mean shift algorithm and the watershed transform. The mean shift algorithm is used to find pixel clusters at particular modes of the probability density function of the image data. Pixel clusters whose modes are above a given threshold are used as markers for the watershed transform. Experimental results show that the proposed method was fast and robust in determining the size distribution of different classes of flotation froth.
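A rough sketch of the mean shift + watershed pipeline described in the flotation entry above, using OpenCV's mean-shift filtering and scikit-image's watershed. The spatial and colour radii, the 90th-percentile marker threshold, and the file name froth.png are placeholders, not values from the paper.

```python
# Hedged sketch: mean-shift smoothing, thresholded modes as markers, watershed.
import cv2
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

bgr = cv2.imread("froth.png")                    # placeholder image path
smoothed = cv2.pyrMeanShiftFiltering(bgr, 15, 20)  # spatial radius, colour radius
gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)

# Bright bubble tops above a threshold act as watershed markers.
peaks = gray > np.percentile(gray, 90)
markers, n_bubbles = ndi.label(peaks)

# Flood from the markers over the inverted intensity surface.
labels = watershed(-gray.astype(float), markers)
sizes = np.bincount(labels.ravel())[1:]          # per-bubble pixel areas
print(f"{n_bubbles} bubbles, median area = {np.median(sizes):.0f} px")
```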
Item An automatic software vulnerability classification framework using term frequency-inverse gravity moment and feature selection (Journal of Systems and Software, 2020-05-15) Mensah, S.; Chen, J.; Kudjo, P.K.; Brown, S.A.; Akorfu, G.
Vulnerability classification is an important activity in software development and software quality maintenance. A typical vulnerability classification model usually involves a stage of term selection, in which the relevant terms are identified via feature selection; a stage of term weighting, in which the document weights for the selected terms are computed; and a stage of classifier learning. Generally, the term frequency-inverse document frequency (TF-IDF) model is the most widely used term-weighting metric for vulnerability classification. However, several issues hinder the effectiveness of the TF-IDF model for document classification. To address this problem, we propose and evaluate a general framework for vulnerability severity classification using the term frequency-inverse gravity moment (TF-IGM). Specifically, we extensively compare the term frequency-inverse gravity moment, term frequency-inverse document frequency, and information gain feature selection using five machine learning algorithms on ten vulnerable software applications containing a total of 27,248 security vulnerabilities. The experimental results show that: (i) the TF-IGM model is a promising term-weighting metric for vulnerability classification compared to the classical term-weighting metric; (ii) the effectiveness of feature selection on vulnerability classification varies significantly across the studied datasets; and (iii) feature selection improves vulnerability classification.
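For the TF-IGM entry above, the weighting is commonly formulated as w(t, d) = tf(t, d) · (1 + λ · igm(t)), where igm(t) = f₁ / Σᵣ fᵣ·r over the term's class frequencies sorted in descending order of rank r. The sketch below assumes λ = 7.0 (a commonly cited default) and uses per-class document frequencies as the class frequencies; both are assumptions, not the paper's exact settings.

```python
# Compact TF-IGM term weighting: discriminative terms (concentrated in one
# class) get igm near 1; evenly spread terms get a small igm.
import numpy as np

def tf_igm(tf, labels, lam=7.0):
    """tf: (docs x terms) raw counts; labels: class id per document."""
    classes = np.unique(labels)
    # f[c, t] = number of documents in class c containing term t.
    f = np.array([(tf[labels == c] > 0).sum(axis=0) for c in classes], float)
    f_sorted = -np.sort(-f, axis=0)                  # descending per term
    ranks = np.arange(1, len(classes) + 1)[:, None]  # rank r = 1..m
    igm = f_sorted[0] / np.maximum((f_sorted * ranks).sum(axis=0), 1e-12)
    return tf * (1.0 + lam * igm)                    # broadcast over documents

# Toy example: 4 documents, 3 terms, 2 classes.
tf = np.array([[2, 0, 1], [1, 0, 2], [0, 3, 1], [0, 2, 1]])
labels = np.array([0, 0, 1, 1])
print(np.round(tf_igm(tf, labels), 2))
```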
Item Basic computer systems, architecture and applications (Intimily Graphics Limited, 271p, 2010) Gyebi, E.B.B.

Item Brain tumor diagnosis based on artificial neural network and a chaos whale optimization algorithm (Computational Intelligence, 2019-11-20) Abza, F.; Gong, S.; Gao, W.
Accurate and early detection of the brain tumor region has a great impact on the choice of treatment, its success rate, and the follow-up of the disease process over time. This study presents a new bio-inspired technique for early detection of the brain tumor area to improve the chance of complete healing. The study presents a multistep technique to detect the brain tumor area. Herein, after image preprocessing and image feature extraction, an artificial neural network is used to determine the tumor area in the image. The method is based on using an improved version of the whale optimization algorithm for optimal selection of the features and for optimizing the artificial neural network weights for classification. Simulation results of the proposed method are applied to the FLAIR, T1, and T2 datasets and are compared with different algorithms. Three performance indexes, including correct detection rate, false acceptance rate, and false rejection rate, are selected for the system performance analysis. Final results showed the superiority of the proposed method over other similar methods.

Item A Classical LTE Cellular System Simulator for Computer Network Education (Hindawi, 2021) Ludu, J.Y.; Appati, J.K.; Owusu, E.; Boakye-Sekyerehene, P.
The proposal of LTE in the standardization of cellular network systems has received considerable attention in the research domain, and most subscribers widely use it. Despite the enormous acceptance of the system, academia, unlike industry, is usually disadvantaged in training students due to the cost implications of setting up a prototype. In bridging this gap, simulators are traditionally developed as testbeds to help students appreciate how these systems work. Although there are several simulators available on the market, some are quite expensive to acquire while others come with license restrictions. In this study, a classical LTE cellular system simulator is proposed as a testbed to aid the teaching of computer networks at college. The proposed simulator extends the functionality of the LTE-Sim framework. Usability testing reveals that the system makes it much easier to simulate the various scenarios in wireless communication.

Item A conceptual model and empirical assessment of HR security risk management (Emerald Publishing Limited, 2019-07-08) Yaokumah, W.; Kumah, P.; Okai, E.S.A.
This study develops a conceptual model and assesses the extent to which pre-employment, during-employment, and post-employment HR security controls are applied in organizations to manage information security risks. The conceptual model is developed based on Agency Theory and a review of theoretical, empirical and practitioner literature. Subsequently, empirical data were collected through a survey of 134 IT professionals, internal audit personnel, and HR managers working within five major industry sectors in a developing country to test for organizational differences in pre-employment, during-employment, and post-employment HR security measures. Using analysis of variance, the findings reveal significant differences among the organizations. Financial institutions perform better in employee background checks, terms and conditions of employment, management responsibilities, security education, training and awareness, and disciplinary processes. Conversely, healthcare institutions outperform other organizations in post-employment security management. Public government institutions perform the worst among all the organizations. The integration of a conceptual model with HR security controls is an area that is under-researched and under-reported in the information security and human resource management literature. Accordingly, this research on HR security management contributes to reducing that gap and adds to the existing HR security risk management literature. It thereby provides an opportunity for researchers to conduct comparative studies between developed and developing nations or to benchmark a specific organization's HR security management.

Item Content-based Image Retrieval using Tesseract OCR Engine and Levenshtein Algorithm (IJACSA, 2021) Adjetey, C.; Adu-Manu, K.S.
Image Retrieval Systems (IRSs) are applications that allow one to retrieve images saved at any location on a network. Most IRSs make use of reverse lookup to find images stored on the network based on image properties such as size, filename, title, color, texture, shape, and description. This paper provides a technique for obtaining a full image document given that the user has some portion of the document under search. To demonstrate the reliability of the proposed technique, we designed a system to implement the algorithm. A combination of an Optical Character Recognition (OCR) engine and an improved text-matching algorithm was used in the system implementation. The Tesseract OCR engine and the Levenshtein algorithm were integrated to perform the image search. The extracted text is compared to the text stored in the database, and a query result is returned when a match ratio of 0.15 or above is obtained. The results showed 100% successful retrieval of the appropriate file based on the match, even when partial query images were submitted.
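The retrieval step in the entry above can be sketched as follows: OCR the partial query image with Tesseract, score it against each stored document's text, and return matches at or above the 0.15 ratio the paper mentions. Here difflib's similarity ratio stands in for a dedicated Levenshtein implementation, and the file names and stored texts are placeholders.

```python
# Hedged sketch of OCR-based content retrieval with a similarity cutoff.
import difflib
from PIL import Image
import pytesseract

def retrieve(query_image_path, stored_docs, threshold=0.15):
    """Return (doc_id, ratio) pairs whose text matches the OCR'd query."""
    query_text = pytesseract.image_to_string(Image.open(query_image_path))
    hits = []
    for doc_id, full_text in stored_docs.items():
        ratio = difflib.SequenceMatcher(None, query_text, full_text).ratio()
        if ratio >= threshold:
            hits.append((doc_id, ratio))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# Placeholder corpus: text previously extracted from stored images.
stored_docs = {"memo_2021.png": "full OCR text previously extracted ..."}
print(retrieve("partial_scan.png", stored_docs))
```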
Item Design and Implementation of an Integrated Web Application for the Motor Traffic and Transport Directorate of the Ghana Police Service (Science and Development, 2018) Abdulai, J.
A major challenge facing most organisations is how to share data and services in a timely and cost-effective manner to simplify business processes. Integrating new application modules or devices with an existing system smoothly, without any discernible errors or complications, is a major issue. The Service Oriented Architecture (SOA), a framework for rapidly creating loosely coupled service application components within time and cost constraints, has been proposed. In this paper, we report on the adoption of an iterative approach to implement an integrated web-based information system for the Motor Transport and Traffic Directorate (MTTD) of the Ghana Police Service. The system is developed as a set of independent web applications sharing a database to provide a single, easy point of information access. In each iteration, a component of the system is designed and tested, thus meeting the project design objectives.

Item Development and validation of an improved DeLone-McLean IS success model - application to the evaluation of a tax administration ERP (International Journal of Accounting Information Systems, 2022) Akrong, G.B.; Owusu, E.; Yunfei, S.
Enterprise resource planning (ERP) is critical to an organization's success. However, the factors that contribute to the success and usage of ERP systems have received little attention. This study developed and validated an improved DeLone-McLean IS success model. Additionally, we examined the factors that influence ERP system usage, employee satisfaction, information quality, service quality, and system quality, as well as the factors that influence the system's overall success. The proposed model is based on a mixed-methods case study (MM-CS). The results show that the proposed model significantly measures the success of an ERP system. Organizational climate, information quality, system quality, and service quality all have an impact on the usage of an ERP system. The proposed model also shows that the use of an ERP system, training and learning, and the three IS quality constructs are all significant predictors of user satisfaction. The results also indicate that gender and years of ICT use have a moderating effect on the relationship between teamwork & support and use among ERP users.

Item The effect of Bellwether analysis on software vulnerability severity prediction models (Software Quality Journal, 2020-01-07) Mensah, S.; Kudjo, P.K.; Chen, J.; Amankwah, R.; Kudjo, C.
Vulnerability severity prediction (VSP) models provide useful insight for vulnerability prioritization and software maintenance. Previous studies have proposed a variety of machine learning algorithms as an important paradigm for VSP. However, to the best of our knowledge, no existing research has focused on investigating how a subset of features can be used to improve VSP. To address this deficiency, this paper presents a general framework for VSP using Bellwether analysis (i.e., exemplary data). First, we apply natural language processing techniques to the textual descriptions of software vulnerabilities. Next, we developed an algorithm termed Bellvul to identify and select an exemplary subset of data (referred to as the Bellwether) to be used as the training set, yielding improved prediction accuracy against the growing portfolio, within-project cases, and the k-fold cross-validation subset. Finally, we assessed the performance of four machine learning algorithms, namely deep neural network, logistic regression, k-nearest neighbor, and random forest, on the sampled instances. The prediction results of the suggested models and the benchmark techniques were assessed using standard classification evaluation metrics such as precision, recall, and F-measure. The experimental results show that the Bellwether approach achieves an F-measure ranging from 14.3% to 97.8%, an improvement over the benchmark techniques. In conclusion, the proposed approach is a promising research direction for assisting software engineers in predicting vulnerability records that demand much attention prior to software release.
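Schematically, Bellwether analysis trains on each candidate dataset in turn, evaluates on the rest, and keeps the one whose model transfers best as the exemplary training set. The sketch below shows that loop under stated assumptions (a random forest and synthetic "projects"); it is not the paper's Bellvul algorithm.

```python
# Hedged sketch of the generic Bellwether selection loop.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

# Four synthetic projects stand in for real vulnerability datasets.
projects = {f"proj{i}": make_classification(n_samples=300, random_state=i)
            for i in range(4)}

def transfer_score(source, datasets):
    """Train on `source`, return mean F1 across all other projects."""
    Xs, ys = datasets[source]
    model = RandomForestClassifier(random_state=0).fit(Xs, ys)
    return np.mean([f1_score(yt, model.predict(Xt))
                    for name, (Xt, yt) in datasets.items() if name != source])

scores = {name: transfer_score(name, projects) for name in projects}
bellwether = max(scores, key=scores.get)
print("bellwether:", bellwether,
      "mean cross-project F1:", round(scores[bellwether], 3))
```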
Item El Niño-Southern Oscillation forecasting using complex networks analysis of LSTM neural networks (Artificial Life and Robotics, 2019-06-04) Broni-Bedaiko, C.; Katsriku, F.A.; Unemi, T.; Atsumi, M.; Abdulai, J-D.; Shinomiya, N.; Owusu, E.
Arguably, the El Niño-Southern Oscillation (ENSO) is the most influential climatological phenomenon, and it has been intensively researched in recent years. The scientific community currently knows much about the underlying processes of the ENSO phenomenon; however, its predictability over longer horizons, which is very important for human society and the natural environment, is still a challenge. Here we show an approach that feeds various complex network metrics extracted from climate networks into a long short-term memory neural network to forecast the ENSO phenomenon. The results suggest that the 12 network metrics extracted as predictors have predictive power and the potential for forecasting the ENSO phenomenon multiple steps ahead.
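A hedged sketch of the forecasting setup in the ENSO entry above: an LSTM maps a window of the 12 climate-network metrics to several future values of an ENSO index. The window length, forecast horizon, layer width, and random toy data are assumptions, not the paper's configuration.

```python
# Minimal Keras sketch: 12 network metrics in, multi-step ENSO forecast out.
import numpy as np
import tensorflow as tf

window, horizon, n_metrics = 24, 6, 12           # 24 months in, 6 months out
X = np.random.rand(500, window, n_metrics).astype("float32")  # toy inputs
y = np.random.rand(500, horizon).astype("float32")            # e.g. an ENSO index

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_metrics)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(horizon),              # multi-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("6-step forecast for one sample:", model.predict(X[:1], verbose=0)[0])
```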
Item EMIS Success Modeling Using Information Systems Quality Factors (IGI Global, 2021) Danso, L.A.; Adjei, J.K.; Yaokumah, W.
This study validates the DeLone and McLean information systems (IS) success model in the context of an education management information system (EMIS). It develops a model to examine the effect of IS quality factors (system quality, information quality, and service quality) on the IS success measures (use, user satisfaction, and net benefit) of an EMIS. The study employs a purposive sampling technique to select participants and a validated structured questionnaire to collect data from 100 users of an EMIS. Employing three multiple regression models, the results show statistically significant relationships between system quality, information quality, service quality, use, and user satisfaction. Overall, among the six constructs measured, system quality, information quality, and use significantly improve the net benefits of an EMIS. However, service quality contributes insignificantly to user satisfaction.

Item Empirical exploration of whale optimisation algorithm for heart disease prediction (Scientific Reports, 2024) Atimbire, S.A.; Appati, J.K.; Owusu, E.
Heart diseases have the highest mortality worldwide, necessitating precise predictive models for early risk assessment. Much existing research has focused on improving model accuracy with single datasets, often neglecting the need for comprehensive evaluation metrics and the utilization of different datasets in the same domain (heart disease). This research introduces a heart disease risk prediction approach by harnessing the whale optimization algorithm (WOA) for feature selection and implementing a comprehensive evaluation framework. The study leverages five distinct datasets, including a combined dataset comprising the Cleveland, Long Beach VA, Switzerland, and Hungarian heart disease datasets. The others are the Z-Alizadeh Sani, Framingham, South African, and Cleveland heart datasets. The WOA-guided feature selection identifies optimal features, which are subsequently integrated into ten classification models. Comprehensive model evaluation reveals significant improvements across critical performance metrics, including accuracy, precision, recall, F1 score, and the area under the receiver operating characteristic curve. These enhancements consistently outperform state-of-the-art methods using the same datasets, validating the effectiveness of our methodology. The comprehensive evaluation framework provides a robust assessment of the model's adaptability, underscoring the WOA's effectiveness in identifying optimal features across multiple datasets in the same domain.
Heart disease (HD) is of utmost importance due to the heart's critical role among human organs. HD has high death rates worldwide, with approximately 17.9 million people dying from heart conditions in 2019 [1]. Heart diseases account for 32% of global deaths, with heart attacks and strokes alone making up more than 85% of recorded deaths. Over 75% of cardiovascular deaths in 2019 occurred in underdeveloped nations, accounting for 38% of deaths under 70 years [1]. Since cardiovascular diseases are fatal, their early detection will enable medical professionals to provide timely healthcare to patients and avert death. Because of a scarcity of ultra-modern examination tools and medical experts, conventional medical methods for diagnosing heart diseases are challenging, complicated, time-consuming, and exorbitant, making the diagnosis of heart diseases difficult and sometimes unavailable, especially in developing countries [2]. Machine and deep learning methods have recently been used to analyze clinical data and make predictions [3]. Machine learning (ML) provides cost-efficient alternatives, where already-collected patient data serve as a data mine for predictive analysis for diagnostic purposes. To improve the accuracy of ML models, some existing works have focused on using various classifiers or their enhanced forms [4-7]. Related works confirm that feature selection reduces data dimensionality and improves model performance significantly [8]. Hence, some studies have sought to improve performance by varying the feature selection methods [9,10]. However, some works that utilize feature selection are fraught with redundant features that affect the recorded metrics. This is affirmed when wrapper methods are used over filter methods, and when embedded methods are used over filter and wrapper methods. It also explains why works that include feature selection may record better performance on some datasets only if the technique is efficient. In addition, although some existing works do not state why specific metrics are not reported, studies such as Hicks et al. [11] have posited that in a clinical setting a subset of metrics may give an erroneous outlook on how a model performs and does not enable holistic model performance evaluation. There is therefore an avenue for more scientific work on feature selection methods capable of improving metrics beyond accuracy. This helps to affirm the reliability of model performance, as the unavailability of multiple evaluation metrics indicates an unbalanced model that cannot be thoroughly assessed. This study proposes the use of the whale optimization algorithm (WOA) as a swarm-inspired feature selection algorithm on five heart datasets and ten models (classical ML, ensemble and deep learning models) for the selection of relevant dataset features. The approach contributes to the body of knowledge in the heart disease domain by providing a comprehensive assessment across five different datasets (in the same domain), ten different models and five evaluation metrics. The proposed methodology also validates the robustness of the WOA algorithm on five datasets of variable sizes in the same domain, in contrast to most works, which do not test their methodologies on multiple datasets in the same domain.
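The whale optimization algorithm in the entry above can be sketched as a wrapper feature selector: whale positions in [0, 1]^d are thresholded at 0.5 into feature masks and scored with a classifier. The population size, iteration count, the spiral constant b = 1, the KNN fitness, and the synthetic data below are assumptions, not the paper's configuration.

```python
# Simplified WOA wrapper for feature selection (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=30, n_informative=8,
                           random_state=0)

def fitness(pos):
    mask = pos > 0.5                             # position -> feature mask
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=3).mean()

n_whales, dim, iters, b = 15, X.shape[1], 25, 1.0
pos = rng.random((n_whales, dim))
fit = np.array([fitness(p) for p in pos])
best_fit = fit.max()
best = pos[fit.argmax()].copy()

for t in range(iters):
    a = 2.0 - 2.0 * t / iters                    # a decreases from 2 to 0
    for i in range(n_whales):
        A = 2 * a * rng.random() - a
        C = 2 * rng.random()
        if rng.random() < 0.5:
            # Encircle the best whale when |A| < 1, else explore a random one.
            ref = best if abs(A) < 1 else pos[rng.integers(n_whales)]
            pos[i] = ref - A * np.abs(C * ref - pos[i])
        else:
            # Logarithmic spiral update toward the best whale.
            l = rng.uniform(-1, 1)
            D = np.abs(best - pos[i])
            pos[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
        pos[i] = np.clip(pos[i], 0.0, 1.0)
    fit = np.array([fitness(p) for p in pos])
    if fit.max() > best_fit:
        best_fit, best = fit.max(), pos[fit.argmax()].copy()

print("selected features:", int((best > 0.5).sum()),
      "CV accuracy:", round(best_fit, 3))
```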
Item An Empirical Study of the Relationship Between Social Networking Sites and Students' Engagement in Higher Education (Journal of Educational Computing Research, 2018-07) Koranteng, F.N.; Wiafe, I.; Kuada, E.
This article investigates how students' online social networking relationships affect knowledge sharing and how the intensity of knowledge sharing enhances students' engagement. It adopts social capital theory as the basis for the investigation, and partial least squares structural equation modeling was used to examine the hypothesized model. Responses from 586 students in higher education were analyzed. The findings provide empirical evidence that contradicts the argument that students perceive social networking sites as an effective tool for learning. Also, contrary to previous studies which posit that knowledge sharing impacts engagement, no relationship was observed between the two. However, as social networking sites differ in terms of member behaviour norms, it is envisaged that if a similar study were conducted and limited to a specific academically inclined social networking site such as Academia.edu, ResearchGate, or Mendeley, different findings might be observed.