Sirohi, Preeti
- Comparative Study of the Search Engines on the Basis of the Relevant Links on the First Web Page
Authors
Affiliations
1 Department of Computer Science, Institute of Management Sciences, Ghaziabad, Uttar Pradesh, IN
Source
International Journal of Knowledge Based Computer System, Vol 2, No 1 (2014), Pagination: 33-38
Abstract
Web search engines are the keys to the Web's vast store of information and are used to extract query-specific results from its complete database. Every search engine uses its own algorithm to rank the links it returns. It is therefore essential for users to understand the differences between search engines in order to attain a higher satisfaction level with the results retrieved for their queries. A great assortment of search engines offers various options to the web user, so it is important to evaluate and compare them in the quest for the engine that returns the largest number of informative links, with relevant result descriptions, on the first results page. The purpose of this paper is to compare four major search engines (Yahoo, Google, Ask, and Bing) for their retrieval efficiency on query topics drawn from different fields (Computer Science, Physics, Chemistry, and Mathematics). The parameter used to judge the best search engine is the number of relevant links returned for the query on the first page. The study uses real-life queries posed regularly by researchers and academicians, and the results of these queries are analyzed. Based on these results, tables are created and the performance of the four selected search engines is evaluated, measured by the quality of the results returned on each engine's first page. The analysis is carried out with the statistical tool ANOVA (Analysis of Variance), which yields the relative performance of the four engines.
Keywords
Search Engines, Google, Yahoo, World Wide Web.
References
- Bharat, K. & Henzinger, M. R. (1998). Improved algorithms for topic distillation in a hyperlinked environment. 21st ACM SIGIR Conference.
- Courtois, M. P., Baer, W. M. & Stark, M. (2005). Cool tools for searching the Web: A performance evaluation. Online, 19(6), 14-32.
- Fazli, R. & Ayisigi, B. (2004). Automatic performance evaluation of web search engines. Information Processing and Management, 40(3), 495-514.
- Haveliwala, T. H. (2002). Topic-sensitive PageRank. In Proceedings of the Eleventh International World Wide Web Conference.
- iProspect Search Engine User Behavior Study. (2006). A report by iProspect and Jupiter Research, January. www.iprospect.com
- Marchionini, G. (1992). Interfaces for end-user information seeking. Journal of the American Society for Information Science, 43(2), 156-163.
- Page, L., Brin, S., Motwani, R. & Winograd, T. (1998). The PageRank citation ranking: Bringing order to the Web. Stanford Digital Libraries Working Paper.
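The ANOVA comparison described in the abstract above can be sketched as follows. The per-query relevant-link counts below are hypothetical placeholders, not data from the paper; only the engine names are taken from the abstract.

```python
def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA over the given groups."""
    k = len(groups)                          # number of groups (engines)
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (variation of group means around the grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (variation of observations around their group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical relevant-link counts on the first results page, per query
counts = {
    "Google": [8, 9, 7, 8],
    "Yahoo":  [6, 7, 6, 7],
    "Bing":   [7, 7, 6, 8],
    "Ask":    [5, 6, 5, 6],
}
f_stat = one_way_anova(list(counts.values()))
print(f"F statistic: {f_stat:.2f}")
```

A large F statistic (relative to the F distribution's critical value for the given degrees of freedom) indicates that the mean relevant-link counts differ significantly across engines.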
- Cloud Service Selection and Experimentation: EAGLTS
Authors
Affiliations
1 Institute of Management Studies, Ghaziabad, U.P., IN
2 UPES, Dehradun, Uttrakhand, IN
3 Amity University, Dubai, AE
Source
Journal of IMS Group, Vol 15, No 1 (2018), Pagination: 73-78
Abstract
Cloud computing technology has captured the market: cloud resources are offered as services and are paid for on a per-usage basis. A growing number of cloud vendors offer services with different parameters and varied quality, and cloud users rent the services that fulfil their application demands. The challenge is how to select the services that meet the user's criteria. In this paper we present a cloud service ranking algorithm that shows the process of effectively ranking cloud services.
Keywords
Cloud Computing, Multi-Objective Optimization, Quality of Service (QoS).
References
- Rimal, B. P., E. Choi, and I. Lumb. "A Taxonomy and Survey of Cloud Computing Systems." Proc. 5th Int. Joint Conf. INC, IMS and IDC (NCM '09), pp. 44-51, 2009.
- Lin, Wenmin, et al. "A QoS-aware service discovery method for elastic cloud computing in an unstructured peer-to-peer network." Concurrency and Computation: Practice and Experience 25.13 (2013): 1843-1860.
- Jula, Amin, Elankovan Sundararajan, and Zalinda Othman. “Cloud computing service composition: A systematic literature review.” Expert Systems with Applications 41.8 (2014): 3809-3824.
- Garg, Saurabh Kumar, Steve Versteeg, and Rajkumar Buyya. "SMICloud: A framework for comparing and ranking cloud services." Utility and Cloud Computing (UCC), 2011 Fourth IEEE International Conference on. IEEE, 2011.
- Ding, S., Yang, S., Zhang, Y., Liang, C., and Xia, C. "Combining QoS prediction and customer satisfaction estimation to solve cloud service trustworthiness evaluation problems." Knowledge-Based Systems 56 (2014): 216.
- Zheng, Zibin, et al. "QoS ranking prediction for cloud services." IEEE Transactions on Parallel and Distributed Systems 24.6 (2013): 1213-1222.
- Chen, Chen-Tung, and Kuan-Hung Lin. “A decision-making method based on interval-valued fuzzy sets for cloud service evaluation.” New Trends in Information Science and Service Science (NISS), 2010 4th International Conference on. IEEE, 2010.
- Turskis, Zenonas, Edmundas Kazimieras Zavadskas, and Friedel Peldschus. "Multi-criteria optimization system for decision making in construction design and management." Engineering Economics 61.1 (2015).
- Qu, Lie, Yan Wang, and Mehmet A. Orgun. "Cloud service selection based on the aggregation of user feedback and quantitative performance assessment." Services Computing (SCC), 2013 IEEE International Conference on. IEEE, 2013.
- Katsaros, Gregory, et al. "A service framework for energy-aware monitoring and VM management in Clouds." Future Generation Computer Systems 29.8 (2013): 2077-2091.
- Chan, Hoi, and Trieu Chieu. "Ranking and mapping of applications to cloud computing services by SVD." Network Operations and Management Symposium Workshops (NOMS Wksps), 2010 IEEE/IFIP. IEEE, 2010.
- Garg, Saurabh Kumar, Steve Versteeg, and Rajkumar Buyya. "A framework for ranking of cloud computing services." Future Generation Computer Systems 29.4 (2013): 1012-1023.
- Siegel, Jane, and Jeff Perdue. "Cloud services measures for global use: the service measurement index (SMI)." SRII Global Conference (SRII), 2012 Annual. IEEE, 2012.
- Bubak, Marian, et al. "Evaluation of cloud providers for VPH applications." Cluster, Cloud and Grid Computing (CCGrid), 2013 13th IEEE/ACM International Symposium on. IEEE, 2013.
- Zissis, Dimitrios, and Dimitrios Lekkas. "Securing e-Government and e-Voting with an open cloud computing architecture." Government Information Quarterly 28.2 (2011): 239-251.
- Leitner, Philipp, and Jürgen Cito. "Patterns in the chaos—a study of performance variation and predictability in public IaaS clouds." ACM Transactions on Internet Technology (TOIT) 16.3 (2016): 15.
- Pan, Yuchen, et al. "Trust-enhanced cloud service selection model based on QoS analysis." PLoS ONE 10.11 (2015): e0143448.
- Elmubarak, Sahar Abdalla, Adil Yousif, and Mohammed Bakri Bashir. "Performance based Ranking Model for Cloud SaaS Services." (2017).
- Liao, Ying-Hong, and Chuen-Tsai Sun. "An educational genetic algorithms learning tool." IEEE Transactions on Education 44.2 (2001): 20-pp.
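The kind of QoS-based service ranking described in the EAGLTS abstract can be sketched with simple additive weighting (SAW), a common multi-criteria ranking technique. This is an illustrative stand-in only, since the abstract does not give the algorithm's details; the service names, QoS attributes, and weights below are all hypothetical.

```python
def rank_services(services, weights, cost_attrs):
    """Rank services by the weighted sum of min-max normalized QoS scores."""
    attrs = list(weights)
    lo = {a: min(s[a] for s in services.values()) for a in attrs}
    hi = {a: max(s[a] for s in services.values()) for a in attrs}

    def score(s):
        total = 0.0
        for a in attrs:
            span = (hi[a] - lo[a]) or 1.0        # avoid division by zero
            norm = (s[a] - lo[a]) / span         # 0..1, higher is better
            if a in cost_attrs:                  # lower is better (price, latency)
                norm = 1.0 - norm
            total += weights[a] * norm
        return total

    return sorted(services, key=lambda name: score(services[name]), reverse=True)

# Hypothetical QoS measurements for three candidate providers
services = {
    "CloudA": {"availability": 99.9, "latency_ms": 120, "price": 0.10},
    "CloudB": {"availability": 99.5, "latency_ms": 80,  "price": 0.08},
    "CloudC": {"availability": 99.0, "latency_ms": 200, "price": 0.05},
}
weights = {"availability": 0.5, "latency_ms": 0.3, "price": 0.2}
ranking = rank_services(services, weights, cost_attrs={"latency_ms", "price"})
print(ranking)
```

Min-max normalization puts heterogeneous QoS attributes on a common 0-1 scale before weighting, so that an attribute measured in milliseconds does not dominate one measured in percent; cost-type attributes are inverted so that higher normalized scores always mean "better".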