Browsing Research / Researchers / Publications by Title
Now showing 1 - 20 of 3668
- Item: 11-day cycle of stock prices in Kenya around profit warnings (Strathmore University, 2020). Kagiri, Jonathan Njenga. A profit warning is a statement issued by a company to inform the public that profits for a specified period will differ significantly from expected levels. The Capital Markets Authority, which regulates the stock exchange, in a bid to reduce information asymmetry and conflicts of interest between managers and shareholders, requires all companies listed on the Nairobi Stock Exchange to issue a profit warning if their profit will be 25% less than expected. This study examines abnormal returns within a 10-day window around the release of a profit warning. The study draws on agency theory, the efficient market hypothesis and signalling theory. An event study methodology was used, with abnormal returns derived from a regression analysis of stock returns against market returns. The results show that abnormal returns are significantly different on the first and second trading days after a profit warning.
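The event-study mechanics described above (abnormal returns as the residual of a stock-versus-market regression) can be sketched as follows; this is a generic illustration on simulated returns, not the study's data or code:

```python
import numpy as np

def abnormal_returns(stock, market, estimation_window):
    """Market-model event study: regress the stock on market returns over
    the estimation window, then take abnormal returns as actual minus
    expected returns over the full series."""
    s_est, m_est = stock[:estimation_window], market[:estimation_window]
    # OLS fit of the market model: R_stock = alpha + beta * R_market
    beta, alpha = np.polyfit(m_est, s_est, 1)
    expected = alpha + beta * market
    return stock - expected

# Illustrative daily returns (hypothetical, not from the study)
rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 120)
stock = 0.0002 + 1.2 * market + rng.normal(0, 0.005, 120)
ar = abnormal_returns(stock, market, estimation_window=100)
print(ar[-10:])  # abnormal returns over the event window
```

Significance of the post-warning days would then be tested against the dispersion of these abnormal returns in the estimation window.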
- Item: 60 @ 60: Development of the Nairobi Securities Exchange (Strathmore University, 2014). Waweru, Freshia. The Nairobi Securities Exchange (NSE) was established in 1954 and recently celebrated its 60th anniversary. However, the number of listed companies over this period has remained small: there are currently 63 listed companies, four of which have been suspended from trading. This study therefore sought to investigate the specific factors influencing company listings at the NSE. The study sought, first, to establish the factors that influence the listing decision among listed companies, and second, to establish why some companies that have met the listing requirements have not opted to list publicly despite the exchange's numerous efforts. For the first objective, a regression analysis was carried out to determine which factors influence the listing decision. The factors analysed were stock market liquidity, stock market volatility, the legal and regulatory framework, and the political environment, with industry, market automation and taxation as control variables. The model was significant at the 5% level with an adjusted R-squared of 68.8%. The political environment was the most significant variable, followed by stock market liquidity and then stock market volatility. A company's industry and market automation were found to be insignificant at the 5% significance level. The second objective used questionnaires to establish why non-listed companies that had met the listing requirements were not yet listed. Most non-listed companies considered the legal and regulatory framework too stringent and hence the leading hindrance to listing. They also considered listing and maintenance costs too high. In addition, most companies did not want the public scrutiny that accompanies listing. Other companies were family owned and wanted to preserve the status quo, while others did not want dilution of ownership.
Most non-listed companies considered access to a wide capital base the leading reason why they would consider listing. To increase the number of listings, therefore, the NSE and the Capital Markets Authority (CMA) should ensure that the market is liquid, so that companies can access capital easily. The ongoing efforts to widen the range of available products should continue in order to attract more investors. The rules and regulations should also be reviewed to ensure they are not too stringent, and the listing costs, which were considered too high, should be reviewed. The NSE should communicate regularly with prospective companies about the measures it is taking to encourage listings. The government should also ensure a stable political environment.
- Item: A Bayesian hierarchical model for correlation in microarray studies. Omolo, Bernard. Microarrays are miniaturised biological devices consisting of molecules (e.g. DNA or protein), called "probes", that are orderly arranged at a microscopic scale on a solid support such as a nylon membrane or a glass slide. The array elements (probes) bind specifically to labelled molecules, called "targets", in complex molecular mixtures, thereby generating signals that reveal the identity and concentration of the interacting labelled cells. Microarray analysis has a broad range of applications involving different types of probes and/or targets (cDNA or oligos).
- Item: A Bi-Lingual counselling chatbot application for support of Gender Based Violence victims in Kenya (Strathmore University, 2024). Mutinda, S. W. Gender-based violence (GBV) remains one of the most prevalent human rights violations globally, crossing national, social, and economic boundaries. However, due to its nature, it is masked within a culture of silence and causes detrimental effects on the dignity, health, autonomy, and security of its victims. The prevalence of GBV is fuelled by cultural nuances and beliefs that justify and promote its acceptability. The stigma surrounding GBV, in addition to fear of the consequences of disclosure, deters victims from seeking help. Additionally, the resources available for addressing GBV, such as legal frameworks and recovery centres, are limited. Technological approaches have been established to tackle GBV as intermediate and supplementary support for victims as part of UN SDG 5. Conversational agents such as Chomi, ChatPal, and Namubot have been developed for counselling GBV victims who struggle with disclosing their predicament to humans. The existing chatbots, however, are not a fit for Kenyan victims because they use languages such as Swedish, Finnish, IsiZulu, Setswana and IsiXhosa, and incorporate referral services specific to their regions. This research addressed this gap by developing a chatbot application suitable for Kenya that counsels GBV victims in both Kiswahili and English, the languages predominantly used in the country, and includes contacts for referral services within the country. The methodology involved developing a chatbot application based on the Rasa open-source AI framework by training a model on a pre-processed counselling dataset.
The performance of the model was evaluated using the NLU confidence score to determine the model's certainty in intent identification, and a confusion matrix was generated; with an 80%/20% training and testing data split, the model achieved 100% accuracy at the classification threshold. Python's fuzzy-matching token set ratio score was also used to determine the response that best matches the input, with scores ranging between 63% and 92% for GBV queries, indicating satisfactory performance. The developed model was then integrated into a web application serving as the user interface for access and interaction, thereby achieving the research objective of developing a chatbot application that conducts counselling for GBV victims in Kenya in English and Kiswahili. Keywords: Gender-based Violence, stigma, chatbot, Rasa open source, NLU Confidence Score, Fuzzy Matching Token Set Ratio score
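The token-set-ratio matching mentioned above (as provided by Python libraries such as thefuzz) can be approximated with the standard library alone; this is a simplified reimplementation for illustration, with hypothetical query and response strings:

```python
from difflib import SequenceMatcher

def token_set_ratio(a, b):
    """Simplified token-set-ratio: compare the tokens the two strings
    share against each string's full sorted token set and keep the best
    similarity, scaled to 0-100 (as in thefuzz/fuzzywuzzy)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    inter = " ".join(sorted(ta & tb))
    sa = (inter + " " + " ".join(sorted(ta - tb))).strip()
    sb = (inter + " " + " ".join(sorted(tb - ta))).strip()
    ratio = lambda x, y: SequenceMatcher(None, x, y).ratio() * 100
    return max(ratio(inter, sa), ratio(inter, sb), ratio(sa, sb))

# Pick the canned counselling response scoring highest against the query
responses = ["You are not to blame for the violence you experienced",
             "Here are referral contacts for legal aid in Kenya"]
query = "where can I get legal aid contacts"
best = max(responses, key=lambda r: token_set_ratio(query, r))
print(best)  # the referral-contacts response wins on shared tokens
```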
- Item: A Blockchain tool to detect and mitigate e-book piracy: a case study of Kenya (Strathmore University, 2023). Nzangi, J. M. Numerous online e-book markets have emerged alongside the growth of e-book readers, increasing the speed and ease with which people share books. As a result, piracy has been skyrocketing, since shared books carry no protection restricting a purchased copy to a single owner. Globally, e-book piracy has been a significant setback for publishers, as the available solutions cannot offer the necessary content protection. A good example is strict Digital Rights Management (DRM), which occasionally frustrates genuine readers by preventing them from accessing their books or forcing them to forfeit ownership if the platform is shut down. The e-book market currently comprises publishers, online platform providers, and writers. E-book piracy has real-world consequences that affect both publishers' and authors' bottom lines and their ability to produce more books. This work developed a non-fungible-token-based e-book platform that enables writers to self-publish e-books and sell them without the risk of piracy. Non-fungible tokens (NFTs) are digital assets that stand in for real-world items such as artwork, collectibles, and game assets, and use blockchain and smart contracts as their underlying digital infrastructure. When published, each book has a separate NFT attached to it. The study used a trusted and secure e-book transaction system that meets the following security requirements: licence verification for each e-book, content confidentiality, right-to-read authorisation, authentication of a genuine buyer, confirmation of the validity and integrity of e-book contents, direct purchase safety, and prevention of e-book piracy and illegal downloading.
The developed solution will be a lifesaver for the e-book industry in Kenya and other regions worldwide, since it offers an easy way for readers and authors to make secure e-book transactions with zero risk of piracy or denial of access for legitimate users. Keywords: Blockchain, E-book, Non-Fungible Tokens, Piracy, Smart Contracts.
- Item: A Blockchain-based prototype for cybersecurity threat intelligence sharing: a case of Kenyan banking and insurance financial institutions (Strathmore University, 2021). Kibuci, Wanjohi Stephen. Cybersecurity threats to financial institutions have become more sophisticated and challenging to deal with. The growing dependence of financial institutions on cyberspace makes cybersecurity preparedness against threats important to achieving a financial institution's mission and vision. In this context, cybersecurity preparedness is the process by which a financial institution can protect against, prevent, mitigate, respond to, and recover from cyber threats. Traditionally, most organizations share threat intelligence through ad hoc methods such as emails and phone calls, but there is a need to automate threat intelligence sharing where possible to improve cybersecurity preparedness. To address this issue and enhance cybersecurity and trust, a blockchain-based approach can be employed to share threat intelligence. This study aims to leverage blockchain technology by developing a prototype to automate cybersecurity threat intelligence sharing in financial institutions. The study used a quantitative approach, collecting data through structured online questionnaires with closed-ended questions and open-source datasets, and analysing it with several analytic tools. The prototype was developed using the Rapid Application Development software development methodology on open-source Oracle VirtualBox running on the Linux operating system.
- Publication: A Case Study on Microfinance: Miriam Wambui. Opiyo, Cavin Otieno. In mid-May 2007, Miriam Wambui, recently appointed as the first manager of a newly established unit office of the Kenya Women Finance Trust (KWFT) at Loitokitok, was wondering how she could meet her loan disbursement and recovery targets when KWFT's 2006 policy restricted her from disbursing loans to the women who most needed them.
- Item: A Comparative study of Hybrid Neural Network and ARIMA Models with application to forecasting intra-day child-line calls in Kenya (Strathmore University, 2022). Wang'ombe, Grace Wairimu. Background: For successful staffing and recruiting of call centre professionals, precise forecasting of the number of calls arriving at the centre is crucial. These projections are needed for various periods, both short and long term. Benchmark time-series models such as ARIMA and Holt-Winters used in forecasting call centre data are outperformed in long-term forecasts, especially when the data is not stationary. Advanced models such as ANNs can pick up random peaks or outlying periods better than the benchmark time-series models. The hybrid methodology combines the strengths of the benchmark time-series and advanced models, thus improving overall forecasts. Objective: The study's primary goal was to assess the superiority of a hybrid ARIMA-ANN model over its constituent models in forecasting Childline call centre data in Kenya. Methods: The ARIMA, ANN and hybrid ARIMA-ANN models were used to forecast the call centre data. Cross-validation was used to produce forecasting accuracy metrics, which were then compared. For ARIMA, the Box-Jenkins methodology was used to fit the model, whereas the neural network element of the hybrid model and the ANN were modelled using the feed-forward Neural Network Autoregressive (NNAR) structure. Results: The seasonal ARIMA-ANN model outperformed the ARIMA model in short-term forecasts and the ANN model in long-term forecasts. The Diebold-Mariano test indicated a significant difference between the hybrid and ANN forecasts, whereas the difference between the hybrid and ARIMA forecasts was not significant. Conclusion: The hybrid model was able to adapt the advantages of both of its constituent models to better its performance.
These results are helpful, as call centres can use one model robust enough to create accurate forecasts rather than relying on the benchmark models.
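The Diebold-Mariano comparison used in the study can be sketched in its simplest one-step form with squared-error loss; the forecast errors below are simulated stand-ins, not the Childline data:

```python
import numpy as np

def diebold_mariano(e1, e2):
    """Simple one-step Diebold-Mariano statistic with squared-error loss:
    tests whether two sets of forecast errors imply equal expected loss.
    Roughly, |DM| > 1.96 indicates a significant difference at 5%."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2   # loss differential
    n = d.size
    return d.mean() / np.sqrt(d.var(ddof=1) / n)

# Hypothetical forecast errors for two competing models
rng = np.random.default_rng(1)
e_hybrid = rng.normal(0, 1.0, 200)
e_ann = rng.normal(0, 1.5, 200)   # visibly larger errors
dm = diebold_mariano(e_hybrid, e_ann)
print(dm)  # strongly negative: the hybrid's loss is smaller
```

Longer-horizon forecasts would additionally require an autocorrelation-robust variance in the denominator.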
- Item: A copula-based approach to differential gene expression analysis. Chaba, Linda Akoth; Odhiambo, John W.; Omolo, Bernard. Melanoma is a major public health concern in the developed world. Melanoma research has been enhanced by the introduction of microarray technology, whose main aim is to identify genes that are associated with outcomes of interest in melanoma biology and disease progression. Many statistical methods have been proposed for gene selection, but so far none of them is regarded as the standard method. In addition, none of the proposed methods has applied copulas to identify genes that are associated with quantitative traits. In this study, we developed a copula-based approach to identify genes that are associated with quantitative traits in the systems biology of melanoma. To assess the statistical properties of the model, we evaluated the power, the false-rejection rate and the true-rejection rate using simulated gene expression data. The model was then applied to a melanoma dataset for validation. Comparison of the copula approach with the Bayesian and other parametric approaches was performed, based on the false discovery rate (FDR), the value of R-squared and prognostic properties. It turned out that the copula model was more robust and better than the others in the selection of genes that were biologically and clinically significant.
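The abstract does not specify how the false discovery rate was controlled; a standard choice for multiple-testing in gene selection is the Benjamini-Hochberg step-up procedure, sketched here on hypothetical per-gene p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up: sort p-values, find the largest rank k
    with p_(k) <= (k/m) * alpha, and reject the k smallest. Returns the
    indices of genes declared significant at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank          # step-up: keep the largest passing rank
    return sorted(order[:k])

# Hypothetical p-values from a per-gene association screen
pvals = [0.001, 0.008, 0.020, 0.041, 0.20, 0.74]
print(benjamini_hochberg(pvals))  # [0, 1, 2]
```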
- Item: A Credit scoring model for mobile lending (Strathmore University, 2024). Oindi, B. An exponential increase in mobile usage has made mobile loans accessible to most Kenyans, creating a lifeline for those excluded by traditional financial institutions. This easier way to borrow, however, comes with risks, the major one being borrower default. This creates a need for credit scoring, which plays a crucial role in lenders' decision-making by determining borrowers' creditworthiness, thereby minimizing credit risk and managing information asymmetry. In mobile lending, borrowers' financial information is usually limited, making machine learning a favourable tool for credit assessment. Traditionally, the process relied on statistical algorithms and human assessment, which fall short when subjected to large datasets and are time-consuming; these traditional methods also struggle to adjust to changes in borrowers' behaviour. Against this backdrop, this research developed a novel credit scoring model for mobile lending using the Random Forest, XGBoost, LightGBM, CatBoost, and AdaBoost algorithms. SMOTE was used to address the class imbalance problem. The best model achieved an accuracy of 86%. The research further analyzes the challenges in credit scoring and reviews related works by several authors. It also examines the feature importance of the models, which effectively explains the models' behaviour. The model can analyze vast volumes of data that would otherwise be resource-intensive to process manually. The machine learning model was then deployed as a Streamlit web application with a user interface where real-time predictions are made based on borrower data. The model can give lenders insights into borrowers' creditworthiness and enable them to make informed decisions before lending. Keywords: Mobile loans. Credit Scoring. Probability of Default. Machine Learning. Statistical Algorithms. SMOTE
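The SMOTE step mentioned above synthesises new minority-class points by interpolating between existing ones and their nearest minority neighbours; a minimal sketch on hypothetical borrower features (not the study's implementation):

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: for each synthetic sample, pick a random
    minority point, one of its k nearest minority neighbours, and
    interpolate a new point on the segment between them."""
    if rng is None:
        rng = np.random.default_rng()
    X_min = np.asarray(X_min, dtype=float)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = rng.choice(nn[i])
        lam = rng.random()                     # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

# Hypothetical minority-class borrower features (2-D for illustration)
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote(X_min, n_new=6, k=2, rng=np.random.default_rng(0))
print(synthetic.shape)  # (6, 2)
```

The synthetic points are then appended to the minority class before model fitting, balancing the two classes.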
- Item: A cross-sectional analysis of the factors influencing company listings on the Nairobi Securities Exchange. Kiboi, Teresa Wanjiku; Waweru, Freshia Mugo (Dr.). This was a cross-sectional study of the specific factors influencing company listings on the Nairobi Securities Exchange (NSE). The study sought to establish what factors affect those companies that have met the threshold listing requirements but have not opted to list publicly on the exchange. Non-listed companies were used, as suggested by prior research, to determine what has hindered their being listed as well as what would motivate them to consider listing on the stock market with regard to the benefits that accrue from listing. Data was collected from two sample groups of companies, listed and non-listed, using the prospectuses of the listed companies and a questionnaire for the non-listed companies. Basic descriptive statistics were used to describe the empirical data; inferential statistics and multiple regression analyses were used for analysis. Among the listed companies, the most influential factor in the listing decision was the political environment, characterised by a change in political regime. The effect cited by the respondents was the (de)regulation of the industries in which the companies were operating, which made expansion possible and consequently prompted use of the capital market to raise funds. An additional factor not considered in the literature emerged among these companies: market automation, which was considered to have made the market more efficient and thus more attractive. For the non-listed companies, the most influential factor was the listing requirements, considered under the legal and regulatory framework; the respondents expressed the view that these were too stringent. The other relatively influential factor was the political environment, which was also rated highly by the respondents.
However, four issues emerged that had previously been covered only scantily. These were determined to be the more influential factors by the respondents with reference to their not being listed: company or organization structure, public scrutiny, dilution of ownership and a lack of necessity to raise long-term funds. Ironically, the most motivating benefit was access to a wide capital base, leading to the conclusion that when a company needs heavy capital financing it would strongly consider using the capital market. Despite these benefits, the study found a need to lower listing and maintenance costs and for the NSE to broaden the scope of its products.
- Item: A Customer churn prediction and corrective action suggestion model for the telecommunications industry using predictive analytics (Strathmore University, 2024). Wanda, R. K. The telecommunications industry is significantly susceptible to customer churn. Customer churn erodes the customer base, which leads to reduced revenue, lower profit margins, increased customer acquisition costs and loss of brand value. Mitigating the effects of customer churn has proved to be a tall order for many organizations in the telecommunications industry. Most companies employ a reactive approach to customer churn and thus take no corrective action until the customer has left; this approach does not enable organizations to anticipate and prevent potential churn before it occurs. Alternatively, some organizations employ a more proactive approach that mitigates customer churn through predictive analytics. Although this approach is more effective, it only predicts which customers will churn without recommending an appropriate corrective action. In this dissertation, a customer churn prediction and corrective action suggestion model using predictive analytics was implemented to predict churn and suggest appropriate corrective actions. The IBM telco customer churn dataset, accessed via API from the OpenML (openml.org) website, was used for this study. The dataset was subjected to pre-processing and exploratory data analysis to gain valuable insights into the data. To enhance the reliability of the developed model, an 80/20 train/test split was applied to the dataset, and the training dataset was divided into 5 folds before model fitting. Several classification algorithms (Logistic Regression, Gaussian Naive Bayes, Complement Naive Bayes, K-NN, Random Forest and CatBoost) were then fit with the training data and their performance was evaluated. Logistic Regression achieved a recall of 80% and was selected for system implementation.
Logistic regression feature coefficients were then used to determine the appropriate corrective actions. A locally hosted web interface was developed using the Python Streamlit library to enable users to feed input into the model and get churn predictions and corrective action suggestions. The developed model demonstrated ease of use and high performance, and will enable telecommunication companies to accurately predict customer attrition and take appropriate corrective actions, reducing attrition's impact on the companies' bottom line. Keywords: churn, machine learning, predictive analytics, telecommunications industry
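The recall metric and the coefficient-based ranking of corrective-action candidates can be illustrated in a few lines; the labels, predictions and feature names below are hypothetical, not taken from the study:

```python
def recall(y_true, y_pred):
    """Recall = true positives / actual positives: the share of actual
    churners the model catches, the metric prioritised in the study."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

def top_churn_drivers(coefs, n=2):
    """Rank features by logistic-regression coefficient: the most positive
    coefficients push churn probability up hardest, so those features are
    the first candidates for corrective action."""
    return sorted(coefs, key=coefs.get, reverse=True)[:n]

# Hypothetical labels, predictions and fitted coefficients
y_true = [1, 1, 1, 1, 0, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 1, 0, 1, 0, 1, 0, 1]
coefs = {"month_to_month_contract": 1.3, "tenure": -0.9,
         "monthly_charges": 0.6, "tech_support": -0.4}
print(recall(y_true, y_pred))   # 5 of 6 actual churners caught
print(top_churn_drivers(coefs))
```

A corrective action would then target the top drivers, e.g. offering a longer contract to month-to-month customers.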
- Item: A Descriptive study on nutrition knowledge and dietary practices among adults with Type 2 Diabetes Mellitus and Hypertension at Kitale County Referral Hospital (Strathmore University, 2023). Kiarie, R. W. Non-communicable diseases (NCDs) are the leading global cause of death, with most of these deaths occurring in low- to middle-income countries (LMICs). Hypertension and diabetes are two of the four major NCDs, and they often occur together as comorbidities. Morbidity and mortality usually result from long-term complications, which, apart from medical therapy, can be prevented by lifestyle interventions that include dietary modifications. This study sought to describe the nutrition knowledge and dietary practices of patients with type 2 diabetes and hypertension. The focus was on patients receiving care at the Kitale County Referral Hospital in Trans Nzoia County, and the study objectives were to (i) assess patients' knowledge of dietary influence on diabetes and hypertension, (ii) assess sociocultural influences on patients' dietary practices, (iii) assess patients' willingness to change their dietary practices, and (iv) assess patients' awareness of their dietary practices. The study was supported by the Social Cognitive Theory, which is premised upon the reciprocal interaction between individual, behavioural and environmental factors. These factors interact to shape the control that individuals have over their illness, thereby influencing their motivation to perform self-care activities. This descriptive cross-sectional study utilized quantitative techniques, with structured questionnaires as the main data collection instrument, in a target population of 973 and a sample size of 283 respondents. Data analysis was carried out using SPSS software: quantitative techniques were used to analyse the data, and descriptive techniques were applied to characterise the respondents.
The results support the following conclusions: the majority of the participants understood the role of diet in the management of these two conditions; some cultural practices posed a challenge to some participants, although they had the family, spousal and social support they needed; participants were willing to change their dietary practices and adhere to the recommended diet regimens; and most participants had received adequate nutrition education and counselling, but eating balanced diets remained a challenge because they could not find all the foods they had been advised to eat and had to consider the cost of buying them. The study recommends sustained efforts in patient education, the inclusion of experiential learning through the use of a hospital kitchen to contextualize the use of locally available foods, and strategies to combat food insecurity, especially among the county's ageing population.
- Item: A Dynamic parallel algorithm for derivatives pricing and hedging on GPU-CPU heterogeneous systems (Strathmore University, 2023). Muganda, B. W. The use of artificial intelligence in the financial services industry has the potential to transform the sector through greater efficiencies, cost reductions and better tools to draw intelligence from large datasets. Access to computing power that is scalable, accurate and reliable has consequently become a major requirement for the industry, owing to increased competition, more products, greater model complexity, growing volumes of data, a stricter regulatory environment and the desire for competitive advantage. In this regard, this research provides methodological solutions for accurate and fast system throughput, cost saving and speed acceleration in a financial institution's financial engineering system by adopting a heterogeneous CPU-GPU parallel architecture with newly formulated algorithms drawn from a dynamic copula framework for option pricing. Estimating option prices and assessing their risk sensitivities under stochastic dynamics, namely stochastic interest rates, stochastic volatility and jumps, for varying strikes, maturities and asset classes is a computationally intensive task, given the complex nature of the pricing methodologies applied. Models that are more fully analytical and less complex for pricing derivatives under stochastic dynamics are desirable for more accurate pricing, investment portfolio construction and risk analysis, together with an associated system prototype that provides real-time results. This thesis formulated dynamic parallel algorithms for derivative security pricing and hedging on GPU-CPU heterogeneous systems. This was achieved through the design and implementation of a real-time derivative pricing system prototype supported by a parallel and distributed architecture.
The parallel architecture was implemented using hybrid parallel programming on CPU and GPU in OpenCL C, Python and R to provide computational acceleration. The GPU implementation resulted in a peak acceleration of 541 times, reducing compute time from 46.24 minutes to 5.12 seconds, with the dynamic models under stochastic volatility and stochastic interest rates improving pricing accuracy by an aggregate of 46.68% over the Black-Scholes framework. The approach adopted in this thesis is of practical importance in harnessing idle processor power, reducing a financial institution's computational resource requirements and providing the accurate, real-time results necessary in trading, hedging, risk assessment and portfolio optimization. Keywords: Dynamic copula, empirical dependence, stochastic volatility, stochastic interest rates, jumps, hybrid GPU acceleration, parallelism
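The quoted peak speedup follows directly from the two reported timings:

```python
# Speedup = serial time / accelerated time for the same workload
cpu_seconds = 46.24 * 60      # 46.24 minutes of serial compute
gpu_seconds = 5.12            # heterogeneous GPU-CPU compute
speedup = cpu_seconds / gpu_seconds
print(speedup)  # 541.875, consistent with the reported peak of 541 times
```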
- Item: A Framework for evaluating ICT use in teacher educ... Oredo, John Otieno. Teachers are under increasing pressure to use Information and Communication Technology to impart to students the knowledge, skills and attitudes they need to survive in the 21st century. The teaching profession needs to migrate from teacher-centred, lecture-based instruction to a student-centred, interactive learning environment. To attain this aspiration, ICT-enabled teacher education is fundamental. Towards this end, international and national authorities have been spending huge amounts of money to facilitate the implementation of ICT in teacher education. This work attempts to evaluate the usage of the available ICT facilities in Kenyan public primary teacher colleges, focusing on the quantity of computer use and the levels attained in using ICT support.
- Publication: A Framework to assess the impact of ICT on the livelihoods of students in tertiary institutions: a case of Strathmore University. Wamicha, Elizabeth; Ateya, Ismail Lukandu. ICT is considered to influence people's livelihoods in a number of ways. This has prompted a great number of citizens to take up training in ICT courses so as to harness the supposed livelihood benefits. This research focuses on the impact ICT has on the livelihoods of students in tertiary institutions. The study uses the livelihoods model as the conceptual model, with the vulnerability context, the human, social and financial capital of the student, and the policies and processes of the tertiary institution as the main variables in developing a framework for assessing the impact ICT has on the livelihoods of students in tertiary institutions. The developed framework is an extension of the livelihoods model, modified to include critical components such as curriculum development and collaboration with industry, academic institutions and alumni, to overcome the gaps observed within existing ICT tertiary institutions. The administration of the framework is in four parts: the first part determines the vulnerability context within which the student operates; the second part outlines the methods used to maximize the livelihood assets of the student; the third part emphasizes the adjustment of institutional policies and procedures; and the fourth part details the incorporation of livelihood strategies into the tertiary institution. The outcome expected from the framework is strengthened relationships between industry and top universities, with increased accountability to stakeholders.
- ItemA Framework to guide companies on adopting cloud computing technologiesBitta, Maurice Nyaoro; Marwanga (Dr.), ReubenCloud computing has emerged as a popular computing model in the Western world. It is still not well understood by many companies in the developing world that may benefit from its pay-per-use models and low hardware and software management costs. This dissertation aims at describing Cloud computing, discussing its benefits and barriers, and proposing a framework that small businesses could use to guide them in the adoption of this new computing paradigm. The dissertation deploys the case study as its research methodology. Three small businesses are studied. All three companies are small businesses as per the definition provided by the European Commission. One company is a non-profit, while the other two are for-profit organizations. One of the two for-profit companies operates in an IT-intensive industry. The proposed framework is built on the premise that the quality of data collected through qualitative enquiry is sufficient for it to be used for evaluative purposes. Also, although three cases may not be a basis large enough for arriving at a scientific conclusion, the research uses Walsham's (1993) argument that, from an interpretive position, the validity of our extrapolation from these cases depends on the plausibility and cogency of the logical reasoning used in describing the results from the cases, and in drawing conclusions from them. From the research, we discover that businesses perceive Cloud computing to be useful and that they are prepared to face the challenges that hinder its adoption, but that they lack a framework to guide them in adopting this technology. This dissertation's key contribution, therefore, is the proposal of a four-stage framework that could be used to guide small businesses in adopting Cloud computing technologies.
- ItemA Fraud investigative and detective framework in the motor insurance industry: a Kenyan perspectiveKisaka, George Ngosiah; Onyango-Otieno., VitalisInsurance fraud is a serious and growing problem, with fraudsters always perfecting their schemes to avoid detection by basic approaches. This has caused a rise in fraudulent claims that get paid and increased loss ratios for insurance firms, thereby diminishing profitability and threatening their very existence. There is widespread recognition that traditional approaches to tackling fraud are inadequate. Studies of insurance fraud have typically focused upon identifying characteristics of fraudulent claims and putting in place different measures to capture them. This thesis proposes an integrated framework to curtail insurance fraud in the Kenyan insurance industry. The research studied existing fraud detection and investigation expertise in depth. The research methodology identified two available theoretical frameworks, the Bayesian Inference Approach and the Mass Detection Tool (MDT). These were compared to comprehensive motor insurance claims fraud management with respect to the insurance industry in Kenya. The findings show that insurance claims fraud is indeed prevalent in the Kenyan industry. Sixty-five percent of claims processing professionals deem the motor segment one of the most fraud-prone, yet a paltry 15 percent of them use technology for fraud detection. This is despite the fact that significant strides have been made in developing systems for fraud detection. These findings were used to determine and propose an integrated ensemble motor insurance fraud detection framework for the Kenyan insurance industry. The proposed framework builds upon the Mass Detection Tool (MDT) and provides a solution for preventing, detecting and managing claims fraud in the motor insurance line of business within the Kenyan insurance industry.
- ItemA GIS decision based model for determining the best path for connection to a power distribution network a case study of Kenya power and lighting company limitedKinuthia, Augustine Muturi; Kimani, StephenThe purpose of this study is to present a GIS-based decision model for determining the best path for connection to a power distribution network. The model was derived from studies that consider the design of the power distribution system and the GIS field of network analysis, along with the method used by KPLC for connecting premises to the distribution network. A digital map of the study area and the distribution network was generated and, taking into account the distributors and distribution transformers, the best path between the premises and the transformer was derived. This study demonstrates that the distributors' length and size and the distribution transformers' capacity, load and location influence the connection of premises to the distribution network. The results also show that combining geospatial methods with the power distribution network enables engineers to visualize the spatial distribution of data in maps, which yields better insight into the nature of the power distribution network.
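The best-path computation the abstract describes is, at its core, a shortest-path query over a weighted network. A minimal sketch of that idea, assuming a simple Dijkstra search over a hypothetical adjacency dict (node names, edge costs, and the cost model are all illustrative, not the thesis's actual data or algorithm; costs might encode distributor length weighted by conductor size and transformer loading):

```python
import heapq

def shortest_path(graph, source, target):
    """Dijkstra's algorithm over graph[node] = [(neighbor, cost), ...]."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            break
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    if target not in dist:
        return None, float("inf")
    # Walk predecessor links back from target to source.
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], dist[target]

# Hypothetical network: transformer T feeds poles P1..P3; premises at H.
# Edge weights stand in for distributor-segment costs (e.g. metres of line).
network = {
    "T": [("P1", 40.0), ("P2", 55.0)],
    "P1": [("P3", 30.0)],
    "P2": [("P3", 10.0), ("H", 80.0)],
    "P3": [("H", 25.0)],
}
path, length = shortest_path(network, "T", "H")
print(path, length)  # ['T', 'P2', 'P3', 'H'] 90.0
```

In a GIS setting, the graph would be built from the digitized distributor lines, with each candidate transformer as a possible source and the premises as the target; running the search per transformer and keeping the cheapest result selects both the connection point and the route.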