Aims: This study aims at designing and implementing syllabus-oriented question-bank system that is capable of producing paper-based exams with multiple forms along with answer keys. The developed software tool is nam...Aims: This study aims at designing and implementing syllabus-oriented question-bank system that is capable of producing paper-based exams with multiple forms along with answer keys. The developed software tool is named Χ(Chi)-Pro Milestone and supports four types of questions, namely: Multiple-choice, True/False, Short-Answer and Free-Response Essay questions. The study is motivated by the fact that student number in schools and universities is continuously growing at high, non-linear, and uncontrolled rates. This growth, however, is not accompanied by an equivalent growth of educational resources (mainly: instructors, classrooms, and labs). A direct result of this situation is having relatively large number of students in each classroom. It is observed that providing and using online-examining systems could be intractable and expensive. As an alternative, paper-based exams can be used. One main issue is that manually produced paper-based exams are of low quality because of some human factors such as instability and relatively narrow range of topics [1]. Further, it is observed that instructors usually need to spend a lot of time and energy in composing paper-based exams with multiple forms. Therefore, the use of computers for automatic production of paper-based exams from question banks is becoming more and more important. Methodology: The design and evaluation of X-Pro Milestone are done by considering a basic set of design principles that are based on a list of identified Functional and Non-Functional Requirements. Deriving those requirements is made possible by developing X-Pro Milestone using the Iterative and Incremental model from software engineering domain. Results: We demonstrate that X-Pro Milestone has a number of excellent characteristics compared to the exam-preparation and question banks tools available in market. Some of these characteristics are: ease of use and operation, user-friendly interface and good usability, high security and protection of the question bank-items, high stability, and reliability. Further, X-Pro Milestone makes initiating, maintaining and archiving Question-Banks and produced exams possible. Putting X-Pro Milestone into real use has showed that X-Pro Milestone is easy to be learned and effectively used. We demonstrate that X-Pro Milestone is a cost-effective alternative to online examining systems with more and richer features and with low infrastructure requirements.展开更多
This research paper analyzes data breaches in the human service sector. The hypothesis for the solution to this problem is that there will be a significant reduction in data breaches in the human service sector due to...This research paper analyzes data breaches in the human service sector. The hypothesis for the solution to this problem is that there will be a significant reduction in data breaches in the human service sector due to an increase in information assurance. The hypothesis is tested using data from the United States Department of Health and Human Services data breach notification repository during January 2018-December 2020. Our result shows that without the increased mitigation of information assurance, data breaches in the human service sector will continue to increase.展开更多
This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends t...This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration to the human element and how people fit into the security process. On the other hand, cyber security adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect presents moral questions since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, a lot of these sources frequently neglect to define the differences or the relationship between information security and cyber security [1]. The paper focus on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity which includes defending people, social norms, and vital infrastructure from threats that arise from online in addition to information and technology protection. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security is a paradigm shift in this regard [1].展开更多
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering...This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it deeply analyzes real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.展开更多
Pervasive schemes are the significant techniques that allow intelligent communication among the devices without any human intervention.Recently Internet of Vehicles(IoVs)has been introduced as one of the applications ...Pervasive schemes are the significant techniques that allow intelligent communication among the devices without any human intervention.Recently Internet of Vehicles(IoVs)has been introduced as one of the applications of pervasive computing that addresses the road safety challenges.Vehicles participating within the IoV are embedded with a wide range of sensors which operate in a real time environment to improve the road safety issues.Various mechanisms have been proposed which allow automatic actions based on uncertainty of sensory and managed data.Due to the lack of existing transportation integration schemes,IoV has not been completely explored by business organizations.In order to tackle this problem,we have proposed a novel trusted mechanism in IoV during communication,sensing,and record storing.Our proposed method uses trust based analysis and subjective logic functions with the aim of creating a trust environment for vehicles to communicate.In addition,the subjective logic function is integrated with multi-attribute SAW scheme to improve the decision metrics of authenticating nodes.The trust analysis depends on a variety of metrics to ensure an accurate identification of legitimate vehicles embedded with IoT devices ecosystem.The proposed scheme is determined and verified rigorously through various IoT devices and decision making metrics against a baseline solution.The simulation results show that the proposed scheme leads to 88%improvement in terms of better identification of legitimate nodes,road accidents and message alteration records during data transmission among vehicles as compared to the baseline approach.展开更多
Autonomic software recovery enables software to automatically detect and recover software faults. This feature makes the software to run more efficiently, actively, and reduces the maintenance time and cost. This pape...Autonomic software recovery enables software to automatically detect and recover software faults. This feature makes the software to run more efficiently, actively, and reduces the maintenance time and cost. This paper proposes an automated approach for Software Fault Detection and Recovery (SFDR). The SFDR detects the cases if a fault occurs with software components such as component deletion, replacement or modification, and recovers the component to enable the software to continue its intended operation. The SFDR is analyzed and implemented in parallel as a standalone software at the design phase of the target software. The practical applicability of the proposed approach has been tested by implementing an application demonstrating the performance and effectiveness of the SFDR. The experimental results and the comparisons with other works show the effectiveness of the proposed approach.展开更多
The complexity of computer architectures, software, web applications, and its large spread worldwide using the internet and the rapid increase in the number of users in companion with the increase of maintenance cost ...The complexity of computer architectures, software, web applications, and its large spread worldwide using the internet and the rapid increase in the number of users in companion with the increase of maintenance cost are all factors guided many researchers to develop software, web applications and systems that have the ability of self-healing. The aim of the self healing software feature is to fast recover the application and keep it running and available for 24/7 as optimal as possible. This survey provides an overview of self-healing software and system that is especially useful in all of those situations in which the involvement of humans is costly and hard to recover and needs to be automated with self healing. There are different aspects which will make us understand the different benefits of these self-healing systems. Finally, the approaches, techniques, mechanisms and individual characteristics of self healing are classified in different tables and then summarized.展开更多
A network analyzer can often comprehend many protocols, which enables it to display talks taking place between hosts over a network. A network analyzer analyzes the device or network response and measures for the oper...A network analyzer can often comprehend many protocols, which enables it to display talks taking place between hosts over a network. A network analyzer analyzes the device or network response and measures for the operator to keep an eye on the network’s or object’s performance in an RF circuit. The purpose of the following research includes analyzing the capabilities of NetFlow analyzer to measure various parts, including filters, mixers, frequency sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other types of network parameter sets including Y, Z, & H-parameters, these instruments are typically employed to measure S-parameters since transmission & reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to distinguish between two-port networks, including filters and amplifiers. By allowing the user to view the actual data that is sent over a network, packet by packet, a network analyzer informs you of what is happening there. Also, this research will contain the design model of NetFlow Analyzer that Measurements involving transmission and reflection use. Gain, insertion loss, and transmission coefficient are measured in transmission measurements, whereas return loss, reflection coefficient, impedance, and other variables are measured in reflection measurements. These analyzers’ operational frequencies vary from 1 Hz to 1.5 THz. These analyzers can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.展开更多
Developments in service oriented architecture (SOA) have taken us near to the once fictional dream of forming and running an online business, such commercial activity in which most or all of its commercial roles are o...Developments in service oriented architecture (SOA) have taken us near to the once fictional dream of forming and running an online business, such commercial activity in which most or all of its commercial roles are outsourced to online services. The novel concept of cloud computing gives a understanding of SOA in which Information Technology assets are provided as services that are extra flexible, inexpensive and striking to commercial activities. In this paper, we concisely study developments in concept of cloud computing, and debate the advantages of using cloud services for commercial activities and trade-offs that they have to consider. Further we presented a layered architecture for online business, and then we presented a conceptual architecture for complete online business working atmosphere. Moreover, we discuss the prospects and research experiments that are ahead of us in realizing the technical components of this conceptual architecture. We conclude by giving the outlook and impact of cloud services on both large and small businesses.展开更多
The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demand of thei...The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demand of their ever increasing users. A typical solution is to increase the bandwidth;this can be achieved with additional cost, but this solution does not scale nor decrease users perceived response time. Another concern is the security of their network. An alternative scalable solution is to deploy a proxy server to provide adequate access and improve response time as well as provide some level of security for clients using the network. While some studies have reported performance increase due to the use of proxy servers, one study has reported performance decrease due to proxy server. We then conducted a six-month proxy server experiment. During this period, we collected access logs from three different proxy servers and analyzed these logs with Webalizer a web server log file analysis program. After a few years, in September 2010, we collected log files from another proxy server, analyzed the logs using Webalizer and compared our results. The result of the analysis showed that the hit rate of the proxy servers ranged between 21% - 39% and over 70% of web pages were dynamic. Furthermore clients accessing the internet through a proxy server are more secured. We then conclude that although the nature of the web is changing, the proxy server is still capable of improving performance by decreasing response time perceived by web clients and improved network security.展开更多
Architectural design is a crucial issue in software engineering. It makes testing more effective as it contribute to carry out the testing in an early stage of the software development. To improve software testability...Architectural design is a crucial issue in software engineering. It makes testing more effective as it contribute to carry out the testing in an early stage of the software development. To improve software testability, the software architect should consider different testability metrics while building the software architecture. The main objective of this research is to conduct an early assessment of the software architecture for the purpose of its improvement in order to make the testing process more effective. In this paper, an evaluation model to assess software architecture (Architecture Design Testability Evaluation Model (ADTEM)) is presented. ADTEM is based on two different testability metrics: cohesion and coupling. ADTEM consists of two phases: software architecture evaluation phase, and component evaluation phase. In each phase, a fuzzy inference system is used to perform the evaluation process based on cohesion and coupling testing metrics. The model is validated by using a case study: Elders Monitoring System. The experimental results show that ADTEM is efficient and gave a considerable improvement to the software testability process.展开更多
In this paper, the role of rare or infrequent terms in enhancing the accuracy of English Text Categorization using Polynomial Networks (PNs) is investigated. To study the impact of rare terms in enhancing the accuracy...In this paper, the role of rare or infrequent terms in enhancing the accuracy of English Text Categorization using Polynomial Networks (PNs) is investigated. To study the impact of rare terms in enhancing the accuracy of PNs-based text categorization, different term reduction criteria as well as different term weighting schemes were experimented on the Reuters Corpus using PNs. Each term weighting scheme on each reduced term set was tested once keeping the rare terms and another time removing them. All the experiments conducted in this research show that keeping rare terms substantially improves the performance of Polynomial Networks in Text Categorization, regardless of the term reduction method, the number of terms used in classification, or the term weighting scheme adopted.展开更多
Despite of the advantages of Information and Communication Technology (ICT) which makes our lives easier, faster, and more connected, the development of ICT is pushing the countries in the direction of ICT application...Despite of the advantages of Information and Communication Technology (ICT) which makes our lives easier, faster, and more connected, the development of ICT is pushing the countries in the direction of ICT applications. The higher education is one of the sectors that try to adopt one of ICT applications through E-learning and using (Moodle) as learning management system. This paper finds out the impact of Moodle on students through examining the students’ acceptance of the system using TAM model. This study was carried out by some teaching members of the Faculty of King Abdullah II School for Information Technology at the University of Jordan during the spring semester of the academic year 2013/2014. The results of this study firm the original TAM’s findings and reveal that the faculties of students and number of previously E-learning courses have an influence on perceive ease of use and perceived usefulness while the level of the academic year and GPA have no significant influence on perceived ease of use. Even though, they have affected on perceived usefulness. Finally, the student’s skills on computer with student’s difficulty in reading from the screen affect perceived ease of use but, it has been found that they have no influence on perceived usefulness.展开更多
The rapid changes and increased complexity in today’s world present new challenges and put new demands on the education system. There has been generally a growing awareness of the necessity?to change and improve the ...The rapid changes and increased complexity in today’s world present new challenges and put new demands on the education system. There has been generally a growing awareness of the necessity?to change and improve the existing system towards online learning. Jordan is one of the distinguished countries in the Middle East with rapid progress in education and with advanced teaching and learning technologies. The University of Jordan is trying to exploit Information and Communication Technology (ICT) in education and moving forward by introducing the latest E-learning management systems (LMSs) to keep pace of technological revolution in the higher education. It is?important to find out the impact of E-learning management system in the University of Jordan,?examine the students’ acceptance for this new system and address the challenges facing the students while using the E-learning management system and these are what this paper is trying to do.展开更多
The purpose of this paper is to provide a better knowledge of the cloud computing as well as to suggest relevant research paths in this growing field. Also, we will go through the future benefits of cloud computing an...The purpose of this paper is to provide a better knowledge of the cloud computing as well as to suggest relevant research paths in this growing field. Also, we will go through the future benefits of cloud computing and the upcoming possible challenges we will have. Intext Cloud, performance, cloud computing, architecture, scale-up, and big data are all terms used in this context. Cloud computing offers a wide range of architectural configurations, including the number of processors, memory, and nodes. Cloud computing has already changed the way we store, process, and access data, and it is expected to continue to have a significant impact on the future of information technology. Cloud computing enables organizations to scale their IT resources up or down quickly and easily, without the need for costly hardware upgrades. This can help organizations to respond more quickly to changing business needs and market conditions. By moving IT resources to the cloud, organizations can reduce their IT infrastructure costs and improve their operational efficiency. Cloud computing also allows organizations to pay only for the resources they use, rather than investing in expensive hardware and software licenses. Cloud providers invest heavily in security and compliance measures, which can help to protect organizations from cyber threats and ensure regulatory compliance. Cloud computing provides a scalable platform for AI and machine learning applications, enabling organizations to build and deploy these technologies more easily and cost-effectively. A task, an application, and its input can take up to 20 times longer or cost 10 times more than optimal. Cloud products’ ready adaptability has resulted in a paradigm change. Previously, an application was optimized for a specific cluster;however, in the cloud, the architectural configuration is tuned for the workload. The evolution of cloud computing from the era of mainframes and dumb terminals has been significant, but there are still many advancements to come. As we look towards the future, IT leaders and the companies they serve will face increasingly complex challenges in order to stay competitive in a constantly evolving cloud computing landscape. Additionally, it will be crucial to remain compliant with existing regulations as well as new regulations that may emerge in the future. It is safe to say that the next decade of cloud computing will be just as dramatic as the last where many internet services are becoming cloud-based, and huge enterprises will struggle to fund physical infrastructure. Cloud computing is significantly used in business innovation and because of its agility and adaptability, cloud technology enables new ways of working, operating, and running a business. The service enables users to access files and applications stored in the cloud from anywhere, removing the requirement for users to be always physically close to actual hardware. Cloud computing makes the connection available from anywhere because they are kept on a network of hosted computers that carry data over the internet. Cloud computing has shown to be advantageous to both consumers and corporations. To be more specific, the cloud has altered our way of life. Overall, cloud computing is likely to continue to play a significant role in the future of IT, enabling organizations to become more agile, efficient, and innovative in the face of rapid technological change. This is likely to drive further innovation in AI and machine learning in the coming years.展开更多
The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to safeguard the safety of students and faculty. Concerns have been expressed...The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to safeguard the safety of students and faculty. Concerns have been expressed about the impact of virtual learning on student performance and grades. The purpose of this study is to investigate the impact of remote learning on student performance and grades, as well as to investigate the obstacles and benefits of this new educational paradigm. The study will examine current literature on the subject, analyze data from surveys and interviews with students and educators, and investigate potential solutions to improve student performance and participation in virtual classrooms. The study’s findings will provide insights into the effectiveness of remote learning and inform ideas to improve student learning and achievement in an educational virtual world. The purpose of this article is to investigate the influence of remote learning on both students and educational institutions. The project will examine existing literature on the subject and collect data from students, instructors, and administrators through questionnaires and interviews. The paper will look at the challenges and opportunities that remote learning presents, such as the effect on student involvement, motivation, and academic achievement, as well as changes in teaching styles and technology. The outcomes of this study will provide insights into the effectiveness of remote learning and will affect future decisions about the usage of virtual learning environments in education. The research will also investigate potential solutions to improve the quality of remote education and handle any issues that occur.展开更多
Video games have been around for several decades and have had many advancements from the original start of video games. Video games started as virtual games that were advertised towards children, and these virtual gam...Video games have been around for several decades and have had many advancements from the original start of video games. Video games started as virtual games that were advertised towards children, and these virtual games created a virtual reality of a variety of genres. These genres included sports games, such as tennis, football, baseball, war games, fantasy, puzzles, etc. The start of these games was derived from a sports genre and now has a popularity in multiplayer-online-shooting games. The purpose of this paper is to investigate different types of tools available for cheating in virtual world making players have undue advantage over other players in a competition. With the advancement in technology, these video games have become more expanded in the development aspects of gaming. Video game developers have created long lines of codes to create a new look of video games. As video games have progressed, the coding, bugs, bots, and errors of video games have changed throughout the years. The coding of video games has branched out from the original video games, which have given many benefits to this virtual world, while simultaneously creating more problems such as bots. Analysis of tools available for cheating in a game has disadvantaged normal gamer in a fair contest.展开更多
Due to rapid development in software industry, it was necessary to reduce time and efforts in the software development process. Software Reusability is an important measure that can be applied to improve software deve...Due to rapid development in software industry, it was necessary to reduce time and efforts in the software development process. Software Reusability is an important measure that can be applied to improve software development and software quality. Reusability reduces time, effort, errors, and hence the overall cost of the development process. Reusability prediction models are established in the early stage of the system development cycle to support an early reusability assessment. In Object-Oriented systems, Reusability of software components (classes) can be obtained by investigating its metrics values. Analyzing software metric values can help to avoid developing components from scratch. In this paper, we use Chidamber and Kemerer (CK) metrics suite in order to identify the reuse level of object-oriented classes. Self-Organizing Map (SOM) was used to cluster datasets of CK metrics values that were extracted from three different java-based systems. The goal was to find the relationship between CK metrics values and the reusability level of the class. The reusability level of the class was classified into three main categorizes (High Reusable, Medium Reusable and Low Reusable). The clustering was based on metrics threshold values that were used to achieve the experiments. The proposed methodology succeeds in classifying classes to their reusability level (High Reusable, Medium Reusable and Low Reusable). The experiments show how SOM can be applied on software CK metrics with different sizes of SOM grids to provide different levels of metrics details. The results show that Depth of Inheritance Tree (DIT) and Number of Children (NOC) metrics dominated the clustering process, so these two metrics were discarded from the experiments to achieve a successful clustering. The most efficient SOM topology [2 × 2] grid size is used to predict the reusability of classes.展开更多
The concept of Webpage visibility is usually linked to search engine optimization (SEO), and it is based on global in-link metric [1]. SEO is the process of designing Webpages to optimize its potential to rank high on...The concept of Webpage visibility is usually linked to search engine optimization (SEO), and it is based on global in-link metric [1]. SEO is the process of designing Webpages to optimize its potential to rank high on search engines, preferably on the first page of the results page. The purpose of this research study is to analyze the influence of local geographical area, in terms of cultural values, and the effect of local society keywords in increasing Website visibility. Websites were analyzed by accessing the source code of their homepages through Google Chrome browser. Statistical analysis methods were selected to assess and analyze the results of the SEO and search engine visibility (SEV). The results obtained suggest that the development of Web indicators to be included should consider a local idea of visibility, and consider a certain geographical context. The geographical region that the researchers are considering in this research is the Hashemite kingdom of Jordan (HKJ). The results obtained also suggest that the use of social culture keywords leads to increase the Website visibility in search engines as well as localizes the search area such as google.jo, which localizes the search for HKJ.展开更多
文摘Aims: This study aims at designing and implementing syllabus-oriented question-bank system that is capable of producing paper-based exams with multiple forms along with answer keys. The developed software tool is named Χ(Chi)-Pro Milestone and supports four types of questions, namely: Multiple-choice, True/False, Short-Answer and Free-Response Essay questions. The study is motivated by the fact that student number in schools and universities is continuously growing at high, non-linear, and uncontrolled rates. This growth, however, is not accompanied by an equivalent growth of educational resources (mainly: instructors, classrooms, and labs). A direct result of this situation is having relatively large number of students in each classroom. It is observed that providing and using online-examining systems could be intractable and expensive. As an alternative, paper-based exams can be used. One main issue is that manually produced paper-based exams are of low quality because of some human factors such as instability and relatively narrow range of topics [1]. Further, it is observed that instructors usually need to spend a lot of time and energy in composing paper-based exams with multiple forms. Therefore, the use of computers for automatic production of paper-based exams from question banks is becoming more and more important. Methodology: The design and evaluation of X-Pro Milestone are done by considering a basic set of design principles that are based on a list of identified Functional and Non-Functional Requirements. Deriving those requirements is made possible by developing X-Pro Milestone using the Iterative and Incremental model from software engineering domain. Results: We demonstrate that X-Pro Milestone has a number of excellent characteristics compared to the exam-preparation and question banks tools available in market. Some of these characteristics are: ease of use and operation, user-friendly interface and good usability, high security and protection of the question bank-items, high stability, and reliability. Further, X-Pro Milestone makes initiating, maintaining and archiving Question-Banks and produced exams possible. Putting X-Pro Milestone into real use has showed that X-Pro Milestone is easy to be learned and effectively used. We demonstrate that X-Pro Milestone is a cost-effective alternative to online examining systems with more and richer features and with low infrastructure requirements.
文摘This research paper analyzes data breaches in the human service sector. The hypothesis for the solution to this problem is that there will be a significant reduction in data breaches in the human service sector due to an increase in information assurance. The hypothesis is tested using data from the United States Department of Health and Human Services data breach notification repository during January 2018-December 2020. Our result shows that without the increased mitigation of information assurance, data breaches in the human service sector will continue to increase.
文摘This paper examines how cybersecurity is developing and how it relates to more conventional information security. Although information security and cyber security are sometimes used synonymously, this study contends that they are not the same. The concept of cyber security is explored, which goes beyond protecting information resources to include a wider variety of assets, including people [1]. Protecting information assets is the main goal of traditional information security, with consideration to the human element and how people fit into the security process. On the other hand, cyber security adds a new level of complexity, as people might unintentionally contribute to or become targets of cyberattacks. This aspect presents moral questions since it is becoming more widely accepted that society has a duty to protect weaker members of society, including children [1]. The study emphasizes how important cyber security is on a larger scale, with many countries creating plans and laws to counteract cyberattacks. Nevertheless, a lot of these sources frequently neglect to define the differences or the relationship between information security and cyber security [1]. The paper focus on differentiating between cybersecurity and information security on a larger scale. The study also highlights other areas of cybersecurity which includes defending people, social norms, and vital infrastructure from threats that arise from online in addition to information and technology protection. It contends that ethical issues and the human factor are becoming more and more important in protecting assets in the digital age, and that cyber security is a paradigm shift in this regard [1].
文摘This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it deeply analyzes real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.
基金funded by the Abu Dhabi University,Faculty Research Incentive Grant(19300483–Adel Khelifi),United Arab Emirates.Link to Sponsor website:https://www.adu.ac.ae/research/research-at-adu/overview.
文摘Pervasive schemes are the significant techniques that allow intelligent communication among the devices without any human intervention.Recently Internet of Vehicles(IoVs)has been introduced as one of the applications of pervasive computing that addresses the road safety challenges.Vehicles participating within the IoV are embedded with a wide range of sensors which operate in a real time environment to improve the road safety issues.Various mechanisms have been proposed which allow automatic actions based on uncertainty of sensory and managed data.Due to the lack of existing transportation integration schemes,IoV has not been completely explored by business organizations.In order to tackle this problem,we have proposed a novel trusted mechanism in IoV during communication,sensing,and record storing.Our proposed method uses trust based analysis and subjective logic functions with the aim of creating a trust environment for vehicles to communicate.In addition,the subjective logic function is integrated with multi-attribute SAW scheme to improve the decision metrics of authenticating nodes.The trust analysis depends on a variety of metrics to ensure an accurate identification of legitimate vehicles embedded with IoT devices ecosystem.The proposed scheme is determined and verified rigorously through various IoT devices and decision making metrics against a baseline solution.The simulation results show that the proposed scheme leads to 88%improvement in terms of better identification of legitimate nodes,road accidents and message alteration records during data transmission among vehicles as compared to the baseline approach.
文摘Autonomic software recovery enables software to automatically detect and recover software faults. This feature makes the software to run more efficiently, actively, and reduces the maintenance time and cost. This paper proposes an automated approach for Software Fault Detection and Recovery (SFDR). The SFDR detects the cases if a fault occurs with software components such as component deletion, replacement or modification, and recovers the component to enable the software to continue its intended operation. The SFDR is analyzed and implemented in parallel as a standalone software at the design phase of the target software. The practical applicability of the proposed approach has been tested by implementing an application demonstrating the performance and effectiveness of the SFDR. The experimental results and the comparisons with other works show the effectiveness of the proposed approach.
文摘The complexity of computer architectures, software, web applications, and its large spread worldwide using the internet and the rapid increase in the number of users in companion with the increase of maintenance cost are all factors guided many researchers to develop software, web applications and systems that have the ability of self-healing. The aim of the self healing software feature is to fast recover the application and keep it running and available for 24/7 as optimal as possible. This survey provides an overview of self-healing software and system that is especially useful in all of those situations in which the involvement of humans is costly and hard to recover and needs to be automated with self healing. There are different aspects which will make us understand the different benefits of these self-healing systems. Finally, the approaches, techniques, mechanisms and individual characteristics of self healing are classified in different tables and then summarized.
文摘A network analyzer can often comprehend many protocols, which enables it to display talks taking place between hosts over a network. A network analyzer analyzes the device or network response and measures for the operator to keep an eye on the network’s or object’s performance in an RF circuit. The purpose of the following research includes analyzing the capabilities of NetFlow analyzer to measure various parts, including filters, mixers, frequency sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other types of network parameter sets including Y, Z, & H-parameters, these instruments are typically employed to measure S-parameters since transmission & reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to distinguish between two-port networks, including filters and amplifiers. By allowing the user to view the actual data that is sent over a network, packet by packet, a network analyzer informs you of what is happening there. Also, this research will contain the design model of NetFlow Analyzer that Measurements involving transmission and reflection use. Gain, insertion loss, and transmission coefficient are measured in transmission measurements, whereas return loss, reflection coefficient, impedance, and other variables are measured in reflection measurements. These analyzers’ operational frequencies vary from 1 Hz to 1.5 THz. These analyzers can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
文摘Developments in service oriented architecture (SOA) have taken us near to the once fictional dream of forming and running an online business, such commercial activity in which most or all of its commercial roles are outsourced to online services. The novel concept of cloud computing gives a understanding of SOA in which Information Technology assets are provided as services that are extra flexible, inexpensive and striking to commercial activities. In this paper, we concisely study developments in concept of cloud computing, and debate the advantages of using cloud services for commercial activities and trade-offs that they have to consider. Further we presented a layered architecture for online business, and then we presented a conceptual architecture for complete online business working atmosphere. Moreover, we discuss the prospects and research experiments that are ahead of us in realizing the technical components of this conceptual architecture. We conclude by giving the outlook and impact of cloud services on both large and small businesses.
文摘The total reliance on internet connectivity and World Wide Web (WWW) based services is forcing many organizations to look for alternative solutions for providing adequate access and response time to the demand of their ever increasing users. A typical solution is to increase the bandwidth;this can be achieved with additional cost, but this solution does not scale nor decrease users perceived response time. Another concern is the security of their network. An alternative scalable solution is to deploy a proxy server to provide adequate access and improve response time as well as provide some level of security for clients using the network. While some studies have reported performance increase due to the use of proxy servers, one study has reported performance decrease due to proxy server. We then conducted a six-month proxy server experiment. During this period, we collected access logs from three different proxy servers and analyzed these logs with Webalizer a web server log file analysis program. After a few years, in September 2010, we collected log files from another proxy server, analyzed the logs using Webalizer and compared our results. The result of the analysis showed that the hit rate of the proxy servers ranged between 21% - 39% and over 70% of web pages were dynamic. Furthermore clients accessing the internet through a proxy server are more secured. We then conclude that although the nature of the web is changing, the proxy server is still capable of improving performance by decreasing response time perceived by web clients and improved network security.
文摘Architectural design is a crucial issue in software engineering. It makes testing more effective as it contribute to carry out the testing in an early stage of the software development. To improve software testability, the software architect should consider different testability metrics while building the software architecture. The main objective of this research is to conduct an early assessment of the software architecture for the purpose of its improvement in order to make the testing process more effective. In this paper, an evaluation model to assess software architecture (Architecture Design Testability Evaluation Model (ADTEM)) is presented. ADTEM is based on two different testability metrics: cohesion and coupling. ADTEM consists of two phases: software architecture evaluation phase, and component evaluation phase. In each phase, a fuzzy inference system is used to perform the evaluation process based on cohesion and coupling testing metrics. The model is validated by using a case study: Elders Monitoring System. The experimental results show that ADTEM is efficient and gave a considerable improvement to the software testability process.
文摘In this paper, the role of rare or infrequent terms in enhancing the accuracy of English Text Categorization using Polynomial Networks (PNs) is investigated. To study the impact of rare terms in enhancing the accuracy of PNs-based text categorization, different term reduction criteria as well as different term weighting schemes were experimented on the Reuters Corpus using PNs. Each term weighting scheme on each reduced term set was tested once keeping the rare terms and another time removing them. All the experiments conducted in this research show that keeping rare terms substantially improves the performance of Polynomial Networks in Text Categorization, regardless of the term reduction method, the number of terms used in classification, or the term weighting scheme adopted.
文摘Despite of the advantages of Information and Communication Technology (ICT) which makes our lives easier, faster, and more connected, the development of ICT is pushing the countries in the direction of ICT applications. The higher education is one of the sectors that try to adopt one of ICT applications through E-learning and using (Moodle) as learning management system. This paper finds out the impact of Moodle on students through examining the students’ acceptance of the system using TAM model. This study was carried out by some teaching members of the Faculty of King Abdullah II School for Information Technology at the University of Jordan during the spring semester of the academic year 2013/2014. The results of this study firm the original TAM’s findings and reveal that the faculties of students and number of previously E-learning courses have an influence on perceive ease of use and perceived usefulness while the level of the academic year and GPA have no significant influence on perceived ease of use. Even though, they have affected on perceived usefulness. Finally, the student’s skills on computer with student’s difficulty in reading from the screen affect perceived ease of use but, it has been found that they have no influence on perceived usefulness.
文摘The rapid changes and increased complexity in today’s world present new challenges and put new demands on the education system. There has been generally a growing awareness of the necessity?to change and improve the existing system towards online learning. Jordan is one of the distinguished countries in the Middle East with rapid progress in education and with advanced teaching and learning technologies. The University of Jordan is trying to exploit Information and Communication Technology (ICT) in education and moving forward by introducing the latest E-learning management systems (LMSs) to keep pace of technological revolution in the higher education. It is?important to find out the impact of E-learning management system in the University of Jordan,?examine the students’ acceptance for this new system and address the challenges facing the students while using the E-learning management system and these are what this paper is trying to do.
文摘The purpose of this paper is to provide a better knowledge of the cloud computing as well as to suggest relevant research paths in this growing field. Also, we will go through the future benefits of cloud computing and the upcoming possible challenges we will have. Intext Cloud, performance, cloud computing, architecture, scale-up, and big data are all terms used in this context. Cloud computing offers a wide range of architectural configurations, including the number of processors, memory, and nodes. Cloud computing has already changed the way we store, process, and access data, and it is expected to continue to have a significant impact on the future of information technology. Cloud computing enables organizations to scale their IT resources up or down quickly and easily, without the need for costly hardware upgrades. This can help organizations to respond more quickly to changing business needs and market conditions. By moving IT resources to the cloud, organizations can reduce their IT infrastructure costs and improve their operational efficiency. Cloud computing also allows organizations to pay only for the resources they use, rather than investing in expensive hardware and software licenses. Cloud providers invest heavily in security and compliance measures, which can help to protect organizations from cyber threats and ensure regulatory compliance. Cloud computing provides a scalable platform for AI and machine learning applications, enabling organizations to build and deploy these technologies more easily and cost-effectively. A task, an application, and its input can take up to 20 times longer or cost 10 times more than optimal. Cloud products’ ready adaptability has resulted in a paradigm change. Previously, an application was optimized for a specific cluster;however, in the cloud, the architectural configuration is tuned for the workload. The evolution of cloud computing from the era of mainframes and dumb terminals has been significant, but there are still many advancements to come. As we look towards the future, IT leaders and the companies they serve will face increasingly complex challenges in order to stay competitive in a constantly evolving cloud computing landscape. Additionally, it will be crucial to remain compliant with existing regulations as well as new regulations that may emerge in the future. It is safe to say that the next decade of cloud computing will be just as dramatic as the last where many internet services are becoming cloud-based, and huge enterprises will struggle to fund physical infrastructure. Cloud computing is significantly used in business innovation and because of its agility and adaptability, cloud technology enables new ways of working, operating, and running a business. The service enables users to access files and applications stored in the cloud from anywhere, removing the requirement for users to be always physically close to actual hardware. Cloud computing makes the connection available from anywhere because they are kept on a network of hosted computers that carry data over the internet. Cloud computing has shown to be advantageous to both consumers and corporations. To be more specific, the cloud has altered our way of life. Overall, cloud computing is likely to continue to play a significant role in the future of IT, enabling organizations to become more agile, efficient, and innovative in the face of rapid technological change. This is likely to drive further innovation in AI and machine learning in the coming years.
文摘The COVID-19 pandemic has had a profound influence on education around the world, with schools and institutions shifting to remote learning to safeguard the safety of students and faculty. Concerns have been expressed about the impact of virtual learning on student performance and grades. The purpose of this study is to investigate the impact of remote learning on student performance and grades, as well as to investigate the obstacles and benefits of this new educational paradigm. The study will examine current literature on the subject, analyze data from surveys and interviews with students and educators, and investigate potential solutions to improve student performance and participation in virtual classrooms. The study’s findings will provide insights into the effectiveness of remote learning and inform ideas to improve student learning and achievement in an educational virtual world. The purpose of this article is to investigate the influence of remote learning on both students and educational institutions. The project will examine existing literature on the subject and collect data from students, instructors, and administrators through questionnaires and interviews. The paper will look at the challenges and opportunities that remote learning presents, such as the effect on student involvement, motivation, and academic achievement, as well as changes in teaching styles and technology. The outcomes of this study will provide insights into the effectiveness of remote learning and will affect future decisions about the usage of virtual learning environments in education. The research will also investigate potential solutions to improve the quality of remote education and handle any issues that occur.
文摘Video games have been around for several decades and have had many advancements from the original start of video games. Video games started as virtual games that were advertised towards children, and these virtual games created a virtual reality of a variety of genres. These genres included sports games, such as tennis, football, baseball, war games, fantasy, puzzles, etc. The start of these games was derived from a sports genre and now has a popularity in multiplayer-online-shooting games. The purpose of this paper is to investigate different types of tools available for cheating in virtual world making players have undue advantage over other players in a competition. With the advancement in technology, these video games have become more expanded in the development aspects of gaming. Video game developers have created long lines of codes to create a new look of video games. As video games have progressed, the coding, bugs, bots, and errors of video games have changed throughout the years. The coding of video games has branched out from the original video games, which have given many benefits to this virtual world, while simultaneously creating more problems such as bots. Analysis of tools available for cheating in a game has disadvantaged normal gamer in a fair contest.
文摘Due to rapid development in software industry, it was necessary to reduce time and efforts in the software development process. Software Reusability is an important measure that can be applied to improve software development and software quality. Reusability reduces time, effort, errors, and hence the overall cost of the development process. Reusability prediction models are established in the early stage of the system development cycle to support an early reusability assessment. In Object-Oriented systems, Reusability of software components (classes) can be obtained by investigating its metrics values. Analyzing software metric values can help to avoid developing components from scratch. In this paper, we use Chidamber and Kemerer (CK) metrics suite in order to identify the reuse level of object-oriented classes. Self-Organizing Map (SOM) was used to cluster datasets of CK metrics values that were extracted from three different java-based systems. The goal was to find the relationship between CK metrics values and the reusability level of the class. The reusability level of the class was classified into three main categorizes (High Reusable, Medium Reusable and Low Reusable). The clustering was based on metrics threshold values that were used to achieve the experiments. The proposed methodology succeeds in classifying classes to their reusability level (High Reusable, Medium Reusable and Low Reusable). The experiments show how SOM can be applied on software CK metrics with different sizes of SOM grids to provide different levels of metrics details. The results show that Depth of Inheritance Tree (DIT) and Number of Children (NOC) metrics dominated the clustering process, so these two metrics were discarded from the experiments to achieve a successful clustering. The most efficient SOM topology [2 × 2] grid size is used to predict the reusability of classes.
文摘The concept of Webpage visibility is usually linked to search engine optimization (SEO), and it is based on global in-link metric [1]. SEO is the process of designing Webpages to optimize its potential to rank high on search engines, preferably on the first page of the results page. The purpose of this research study is to analyze the influence of local geographical area, in terms of cultural values, and the effect of local society keywords in increasing Website visibility. Websites were analyzed by accessing the source code of their homepages through Google Chrome browser. Statistical analysis methods were selected to assess and analyze the results of the SEO and search engine visibility (SEV). The results obtained suggest that the development of Web indicators to be included should consider a local idea of visibility, and consider a certain geographical context. The geographical region that the researchers are considering in this research is the Hashemite kingdom of Jordan (HKJ). The results obtained also suggest that the use of social culture keywords leads to increase the Website visibility in search engines as well as localizes the search area such as google.jo, which localizes the search for HKJ.