Open Access Original Research Article

Software Piracy in Nigeria

David Roland Andembubtob, Jonathan Iliya Apuru, Siyani Dogo Ezra

Asian Journal of Research in Computer Science, Volume 6, Issue 1, Page 1-13
DOI: 10.9734/ajrcos/2020/v6i130148

The extent of software piracy varies across countries, and Nigeria is no exception. Software piracy occurs in diverse forms such as softlifting, hard-disk loading, counterfeiting and unauthorized renting, and many reasons lie behind it. Software piracy has many negative economic consequences: competition distorted by pirated software at the expense of local industries, loss of tax revenue and jobs owing to the lack of a legitimate market, and increased cost recovery, all of which affect the social well-being of the citizenry. Findings have revealed that Nigeria has the highest incidence of software piracy, intellectual property theft and other sharp practices in the IT industry in Africa. Hence, this work investigates software piracy in Nigeria, examining its concepts, causes and effects, and proffering solutions. We adopted a descriptive survey design. The research instrument was an online questionnaire administered to a sample of 3270 respondents drawn from the six geopolitical zones of Nigeria. The results show that software piracy has a statistically significant effect on the economy of Nigeria and that the high standard of living is the biggest cause of software piracy in the country.
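The abstract does not state which significance test underlies the claim of a statistically significant effect; a common choice for questionnaire data of this kind is a chi-square goodness-of-fit test, sketched below in Python with invented response counts purely for illustration (they are not the paper's data).

    from scipy import stats

    # Invented counts for one question ("piracy harms the economy") across
    # 3270 respondents, from "strongly agree" to "strongly disagree".
    observed = [1450, 820, 400, 350, 250]
    expected = [3270 / 5] * 5          # null hypothesis: responses uniformly spread

    chi2, p = stats.chisquare(observed, expected)
    print(f"chi-square = {chi2:.1f}, p-value = {p:.3g}")   # p < 0.05 -> significant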

Open Access Original Research Article

An Effective Algorithm for Denoising Salt-And-Pepper Noise in Real-Time

Obed Appiah, James Benjamin Hayfron-Acquah, Michael Asante

Asian Journal of Research in Computer Science, Volume 6, Issue 1, Page 14-27
DOI: 10.9734/ajrcos/2020/v6i130149

For computer vision systems to effectively perform diagnosis, identification, tracking, monitoring and surveillance, image data must be devoid of noise. Various types of noise, such as salt-and-pepper (impulse), Gaussian, shot, quantization, anisotropic and periodic noise, corrupt images and make it difficult to extract relevant information from them. This has led to many algorithms being proposed to address the problem. Among them, the median filter has been successful in handling salt-and-pepper noise while preserving edges in images. However, its moderate-to-high running time and its poor performance on images corrupted with high noise densities have led to various proposed modifications of the median filter. The challenge observed with all these modifications is the trade-off between efficient running time and the quality of the denoised images. This paper proposes an algorithm that delivers quality denoised images in low running time. Two state-of-the-art algorithms are combined into one, and a technique called Mid-Value-Decision-Median is introduced into the proposed algorithm to deliver high-quality denoised images in real time. The proposed algorithm, the High-Performance Modified Decision Based Median Filter (HPMDBMF), runs about 200 times faster than the state-of-the-art Modified Decision Based Median Filter (MDBMF) and still generates equivalent output.
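The abstract does not describe the HPMDBMF internals, so the following Python sketch shows only the generic decision-based median filtering idea on which such filters build, assuming 8-bit grayscale images; the function name, 3x3 window and fallback rule are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def decision_based_median_filter(img, window=3):
        # Pixels at the extreme intensities (0 or 255) are treated as
        # salt-and-pepper noise candidates; each is replaced by the median
        # of the noise-free pixels in its local window, while uncorrupted
        # pixels are left untouched (the "decision" step).
        pad = window // 2
        padded = np.pad(img, pad, mode='edge')
        out = img.copy()
        noisy = (img == 0) | (img == 255)
        for r, c in zip(*np.nonzero(noisy)):
            block = padded[r:r + window, c:c + window]
            clean = block[(block > 0) & (block < 255)]
            # Median of noise-free neighbours; fall back to the plain median
            # if every neighbour is itself a noise candidate.
            out[r, c] = np.median(clean) if clean.size else np.median(block)
        return out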

Open Access Original Research Article

Advantage to Short Circuit Current of Calculation on Power System by the MVA Method

Ming-Jong Lin

Asian Journal of Research in Computer Science, Volume 6, Issue 1, Page 28-36
DOI: 10.9734/ajrcos/2020/v6i130150

The short-circuit current at each piece of equipment must be carefully calculated. If the fault point cannot be isolated effectively while the system is abnormal, the fault time is prolonged and the power system becomes unstable. The calculation is complicated because the capacity of every piece of equipment on the system must be established. This article begins by introducing the traditional calculation method. Next, it establishes the basic principles of the simplified procedure: the capacities of two parallel devices are combined as in a series calculation (i.e., added), while the capacities of two series devices are combined as in a parallel calculation. Finally, the fault short-circuit current is calculated from the actual equipment capacities on the power system. Another technique is to transform the transmission-line structure from delta (△) to wye (Y), which simplifies the calculation. The MVA method proves to be a simple and easy-to-understand way of calculating short-circuit fault current. It provides an accurate and effective calculation of the system short-circuit capacity and current at each equipment point, so that designers can set appropriate specifications in design and procurement and avoid wasteful investment.
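A minimal Python sketch of the combination rules described above, assuming three-phase quantities: short-circuit MVAs of series devices combine like parallel resistances, those of parallel devices simply add, and the fault current follows from the fault MVA and the line-to-line voltage. The example figures (a 500 MVA source, a 20 MVA transformer at 6% impedance, an 11 kV bus) are illustrative and not taken from the paper.

    from math import sqrt

    def series_mva(*mvas):
        # Devices in series: combine short-circuit MVAs like parallel resistors.
        return 1.0 / sum(1.0 / m for m in mvas)

    def parallel_mva(*mvas):
        # Devices in parallel: short-circuit MVAs simply add.
        return sum(mvas)

    def fault_current_ka(mva_sc, kv_ll):
        # Three-phase fault current in kA from fault MVA and line-to-line kV.
        return mva_sc / (sqrt(3) * kv_ll)

    utility = 500.0                     # available fault MVA from the source
    transformer = 20.0 / 0.06           # transformer MVA rating / per-unit impedance
    fault_mva = series_mva(utility, transformer)
    print(round(fault_mva, 1), "MVA,", round(fault_current_ka(fault_mva, 11.0), 2), "kA at 11 kV")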

Open Access Original Research Article

Time Series Predictive Models for Social Networking Media Usage Data: The Pragmatics and Projections

M. A. Jayaram, Gayitri Jayatheertha, Ritu Rajpurohit

Asian Journal of Research in Computer Science, Volume 6, Issue 1, Page 37-54
DOI: 10.9734/ajrcos/2020/v6i130151

Aims: This work sets forth three main objectives: (i) to study how social networking media usage has surged over time for three networks, namely Facebook, Twitter and LinkedIn; (ii) to develop best-fitting time series predictive models for predicting future usage of the three networks; and (iii) to make a comparative analysis of the ups and downs noticed in usage across the three networks considered.

Study Design: Application of time series techniques to the analysis of social network user data. The main research question addressed by this work is how well time series models perform on time-dependent data such as the data chosen in this research.

Place and Duration of Study: Research Center, Department of Master of Computer Applications, Siddaganga Institute of Technology, Tumakuru, Karnataka, India, between January 2020 and April 2020.

Methodology: The work involved collecting user data for three social networks (Facebook, LinkedIn and Twitter) over a span of nine years, 2011-2019. One-, two- and three-dimensional visual analytics were performed prior to the time series analysis. The time series predictive analytics involved developing best-fit models for prediction. To select the best fits among linear, polynomial, exponential, power-function and logarithmic models, mean absolute error (MAE) and root mean square error (RMSE) metrics were used.
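A minimal Python sketch of this model-selection step, using an invented yearly user series (not the paper's data): each candidate trend line is fitted to the series and scored by MAE and RMSE, and the lowest-error fit would be retained.

    import numpy as np

    years = np.arange(2011, 2020)
    users = np.array([0.8, 1.0, 1.2, 1.35, 1.55, 1.8, 2.0, 2.2, 2.4])  # invented, in billions
    t = years - years.min() + 1        # shift so log/power fits are defined

    candidates = {
        "linear":         np.polyfit(t, users, 1),
        "polynomial (2)": np.polyfit(t, users, 2),
        "exponential":    np.polyfit(t, np.log(users), 1),             # log-linear fit
        "power":          np.polyfit(np.log(t), np.log(users), 1),     # log-log fit
        "logarithmic":    np.polyfit(np.log(t), users, 1),
    }

    def predict(name, coef, t):
        if name == "exponential":
            return np.exp(np.polyval(coef, t))
        if name == "power":
            return np.exp(np.polyval(coef, np.log(t)))
        if name == "logarithmic":
            return np.polyval(coef, np.log(t))
        return np.polyval(coef, t)

    for name, coef in candidates.items():
        fit = predict(name, coef, t)
        mae = np.mean(np.abs(users - fit))
        rmse = np.sqrt(np.mean((users - fit) ** 2))
        print(f"{name:15s} MAE={mae:.3f}  RMSE={rmse:.3f}")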

Results: Linear and polynomial function trend lines proved to be the best for Facebook, LinkedIn and Twitter, respectively, with low MAE and RMSE values and high regression coefficients compared with the other kinds of models. Apart from the error metrics, Theil's U-statistic values of 0.928, 1.008 and 1.21 for Facebook, Twitter and LinkedIn, respectively, also indicated that these functions are superior when compared with other naïve models. It is also projected that by 2025 Facebook will see 10,000 billion users, followed by LinkedIn at 1500 billion and Twitter at 750 billion, if the same surge trend prevails in user numbers across the three networks considered in this research.
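The abstract does not state which form of Theil's U was computed; the sketch below assumes the common U2 form, which compares a model's forecasts against a naive no-change forecast (values below 1 favour the model).

    import numpy as np

    def theil_u2(actual, forecast):
        # Theil's U2: relative error of the model versus a naive
        # "next value equals current value" forecast.
        actual = np.asarray(actual, dtype=float)
        forecast = np.asarray(forecast, dtype=float)
        num = np.sum(((forecast[1:] - actual[1:]) / actual[:-1]) ** 2)
        den = np.sum(((actual[1:] - actual[:-1]) / actual[:-1]) ** 2)
        return np.sqrt(num / den)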

Conclusion: This paper presents work that, to the best of the authors' knowledge, is the first of its kind. The models come with a limitation: they can provide accurate projections only if the same trend prevails in the pattern of usage.

Open Access Original Research Article

Modified Ratio-Cum-Product Estimators of Population Mean Using Two Auxiliary Variables

J. O. Muili, E. N. Agwamba, Y. A. Erinola, M. A. Yunusa, A. Audu, M. A. Hamzat

Asian Journal of Research in Computer Science, Volume 6, Issue 1, Page 55-65
DOI: 10.9734/ajrcos/2020/v6i130152

A percentile is a measure of location used by statisticians, showing the value below which a given percentage of observations in a group falls. A family of ratio-cum-product estimators is proposed for estimating the finite population mean of the study variable when the finite population means of two auxiliary variables are known, under simple random sampling without replacement (SRSWOR). The main purpose of this study is to develop new ratio-cum-product estimators that improve the precision of estimation of the population mean in simple random sampling without replacement, using information on percentiles of two auxiliary variables. Expressions for the bias and mean square error (MSE) of the proposed estimators were derived by the Taylor series method up to the first degree of approximation. The efficiency conditions under which the proposed ratio-cum-product estimators are better than the sample mean, the ratio estimator, the product estimator and other estimators considered in this study have been established. The numerical and empirical results show that the proposed estimators are more efficient than the sample mean, the ratio estimator, the product estimator and the other existing estimators.
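The abstract does not give the exact form of the proposed estimators, so the following Python sketch illustrates only the classical ratio-cum-product idea they extend: a ratio adjustment on a positively correlated auxiliary x, a product adjustment on a negatively correlated auxiliary z, and an empirical MSE comparison over repeated SRSWOR samples. The population, sample size and correlations are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented finite population: study variable y, auxiliary x (positively
    # correlated with y) and auxiliary z (negatively correlated with y).
    N, n = 2000, 100
    x = rng.gamma(5.0, 2.0, N)
    z = rng.gamma(4.0, 3.0, N)
    y = 3.0 + 1.5 * x - 0.8 * z + rng.normal(0.0, 2.0, N)
    X_bar, Z_bar, Y_bar = x.mean(), z.mean(), y.mean()

    def estimates(idx):
        yb, xb, zb = y[idx].mean(), x[idx].mean(), z[idx].mean()
        return {
            "sample mean":       yb,
            "ratio (on x)":      yb * X_bar / xb,
            "product (on z)":    yb * zb / Z_bar,
            "ratio-cum-product": yb * (X_bar / xb) * (zb / Z_bar),
        }

    # Empirical MSE of each estimator over repeated SRSWOR samples.
    errors = {k: [] for k in ["sample mean", "ratio (on x)", "product (on z)", "ratio-cum-product"]}
    for _ in range(5000):
        idx = rng.choice(N, n, replace=False)
        for k, v in estimates(idx).items():
            errors[k].append((v - Y_bar) ** 2)
    for k, sq in errors.items():
        print(f"{k:18s} empirical MSE = {np.mean(sq):.4f}")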