Contingent self-esteem captures the fragile nature of self-esteem and is often regarded as suboptimal for psychological functioning. Self-compassion is another important self-related concept assumed to promote mental health and well-being. However, research on the relation of self-compassion to contingent self-esteem is lacking. Two studies were conducted to explore the role of self-compassion, either as a personal characteristic or an induced mindset, in influencing the effects of contingent self-esteem on well-being. Study 1 recruited 256 Chinese college students (30.4% male, mean age = 21.72 years) who completed measures of contingent self-esteem, self-compassion, and well-being. The results showed that self-compassion moderated the effect of contingent self-esteem on well-being. In Study 2, a sample of 90 Chinese college students (34% male, mean age = 18.39 years) was randomly assigned to either a control or a self-compassion group. They completed baseline trait measures of contingent self-esteem, self-compassion, and self-esteem. They then either took a 12-min break (control group) or listened to a 12-min self-compassion audio (self-compassion group), followed by a social stress task and outcome measures. The results demonstrated the effectiveness of the brief self-compassion training and its moderating role in the effects of contingent self-esteem on negative affect after the social stress task. This research suggests that adopting a self-compassionate mindset could lower the risk of impaired well-being associated with contingent self-esteem, which involves a fragile sense of self-worth. It may also provide insights into the development of an "optimal self-esteem" and the improvement of well-being.
A systematic study was carried out to assess the level of contamination with fluorides and heavy metals in the drinking water of the city of Daloa, as well as the risks to the health of consumers. The waters of 11.11% of the sites sampled exceeded the fluoride limit for drinking water, with a contamination index (CI) greater than 0. All the waters recorded concentrations of cadmium (Cd), copper (Cu), iron (Fe), manganese (Mn) and lead (Pb) above the recommended values, with CI > 0. However, 22.22% of the sites recorded concentrations below the standard for zinc (Zn), with CI < 0. The assessment of adverse effects on human health showed that the chronic daily intake (CDI) of fluoride and metals was less than 1 (CDI < 1) for both adults and children, except for Zn, where CDI > 1 for children in 22.22% of the drinking water studied. Hazard quotients (HQs) averaged less than 1 for fluoride and greater than 1 for all metals, and the danger indices have values greater than 1. The incremental lifetime cancer risk (ILCR) and the total ILCR are above the recommended values. These results show that the sampled drinking water is of poor quality due to elevated levels of heavy metals, which can constitute a danger to human health. Long-term use of such poor-quality water can lead to cancer in consumers. It is therefore necessary to treat this water to remove the metals before using it for drinking. This study can help decision-makers and the competent authorities in charge of water management.
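The screening quantities in this abstract follow the standard exposure-assessment formulas; a minimal sketch of the arithmetic (the exposure parameters and the reference dose below are illustrative assumptions, not values from the study):

```python
def chronic_daily_intake(conc, intake_rate, exp_freq, exp_dur, body_weight, avg_time):
    """CDI (mg/kg/day) = (C * IR * EF * ED) / (BW * AT)."""
    return (conc * intake_rate * exp_freq * exp_dur) / (body_weight * avg_time)

def hazard_quotient(cdi, rfd):
    """HQ = CDI / RfD; HQ > 1 flags potential non-carcinogenic risk."""
    return cdi / rfd

def hazard_index(hqs):
    """Danger (hazard) index = sum of the HQs of all contaminants."""
    return sum(hqs)

# Illustrative adult parameters (assumed): 0.05 mg/L Pb, 2 L/day intake,
# 365 days/year over 30 years, 70 kg body weight, averaged over 30*365 days.
cdi_pb = chronic_daily_intake(0.05, 2, 365, 30, 70, 30 * 365)
hq_pb = hazard_quotient(cdi_pb, 0.0035)  # 0.0035 mg/kg/day is an assumed RfD
```

The same CDI feeds the cancer-risk side as ILCR = CDI × slope factor, summed over carcinogens for the total ILCR.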
We propose a new clustering algorithm that helps researchers analyze data quickly and accurately. We call this algorithm the Combined Density-based and Constraint-based Algorithm (CDC). CDC consists of two phases. In the first phase, CDC employs the idea of density-based clustering to split the original data into a number of fragmented clusters, cutting off noise and outliers at the same time. In the second phase, CDC employs the concept of the K-means algorithm to select the largest cluster as the center; the center cluster then merges smaller clusters that satisfy certain constraint rules. Because the smaller clusters are merged around the center cluster, the clustering results show high accuracy. Moreover, CDC reduces the amount of computation and speeds up the clustering process. In this paper, the accuracy of CDC is evaluated and compared with those of K-means, hierarchical clustering, and the genetic clustering algorithm (GCA) proposed in 2004. Experimental results show that CDC performs better.
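A toy sketch of the two phases (a simplified density expansion stands in for the density-based phase, and a size threshold stands in for the paper's constraint rules; `eps`, `min_pts`, and `min_size` are assumed illustrative parameters, not the paper's):

```python
from collections import Counter, deque

def density_clusters(points, eps, min_pts):
    """Phase 1: grow density-based fragments; sparse points become noise (-1)."""
    def neighbours(i):
        return [j for j in range(len(points)) if j != i and
                (points[i][0] - points[j][0]) ** 2 +
                (points[i][1] - points[j][1]) ** 2 <= eps ** 2]
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # noise / outlier is cut off
            continue
        labels[i] = cid
        queue = deque(nbrs)
        while queue:
            j = queue.popleft()
            if labels[j] is None:
                if len(neighbours(j)) >= min_pts:
                    queue.extend(neighbours(j))
                labels[j] = cid
            elif labels[j] == -1:   # border point reached from a dense core
                labels[j] = cid
        cid += 1
    return labels

def merge_small(labels, min_size):
    """Phase 2: the largest fragment becomes the centre and absorbs
    fragments smaller than min_size (stand-in for the constraint rules)."""
    sizes = Counter(l for l in labels if l != -1)
    centre = max(sizes, key=sizes.get)
    return [centre if l != -1 and sizes[l] < min_size else l for l in labels]
```

Merging the small fragments into the centre cluster is what removes the fragmentation produced by phase 1 while keeping the noise points excluded.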
In digital fingerprinting, preventing the piracy of images by colluders is an important and tedious issue. Each image is embedded with a unique User IDentification (UID) code that serves as the fingerprint for tracking the authorized user. The proposed hiding scheme uses a random number generator to scramble two copies of a UID, which are then hidden in randomly selected medium-frequency coefficients of the host image. A linear support vector machine (SVM) is trained as a classifier by computing the normalized correlation (NC) for the two-class UID codes. The trained classifiers serve as models for identifying unreadable UID codes. Experimental results showed that the success rate of predicting unreadable UID codes can be increased by applying the SVM. The proposed scheme can be used to protect the intellectual property rights of digital images and to keep track of users to prevent collaborative piracy.
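The normalized correlation that feeds the SVM can be sketched as follows (bipolar ±1 UID codes are an assumption for illustration; the SVM training step itself is omitted):

```python
def normalized_correlation(extracted, reference):
    """NC in [-1, 1]: 1 for identical bipolar codes, -1 for inverted ones."""
    num = sum(x * y for x, y in zip(extracted, reference))
    den = (sum(x * x for x in extracted) ** 0.5 *
           sum(y * y for y in reference) ** 0.5)
    return num / den
```

The NC values of an extracted (possibly damaged) UID against the two reference classes form the feature on which the linear classifier is trained.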
Recently, a reversible image transformation (RIT) technology that transforms a secret image into a freely selected target image was proposed. It not only generates a stego-image that looks similar to the target image, but also recovers the secret image without any loss. It has also proved very useful in image content protection and reversible data hiding in encrypted images. However, the standard deviation (SD) is the only feature used when matching secret and target image blocks in RIT methods; the matching result is not good and needs further improvement, since the distributions of the SDs of the two images may not be very similar. Therefore, this paper proposes a gray-level co-occurrence matrix (GLCM) based approach to reversible image transformation, in which an effective feature extraction algorithm is used to increase the accuracy of block matching and thereby improve the visual quality of the transformed image, without increasing the auxiliary information that records the transformation parameters. Thus, the visual quality of the stego-image is improved. Experimental results also show that the root mean square error of the stego-image can be reduced by 4.24% compared with the previous method.
Performing analytics on the load curves (LCs) of customers is the foundation of demand response, which requires a better understanding of customers' consumption patterns (CPs) obtained by analyzing the load curve. However, the performance of previously widely used LC clustering methods is poor in two respects: they produce a large number of clusters, and there are huge variances within each cluster (a CP is extracted from each cluster), making it very difficult to understand the electricity consumption patterns of customers. In this paper, to improve the performance of LC clustering, a clustering framework incorporating community detection is proposed. The framework includes three parts: network construction, community detection, and CP extraction. According to the cluster validity index (CVI), the integrated approach outperforms the previous state-of-the-art method with the same number of clusters, and it needs fewer clusters to achieve the same performance as measured by the CVI.
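A minimal sketch of the three parts, under simplifying assumptions (Euclidean distance for network construction, connected components as a deliberately crude stand-in for a real community-detection algorithm, and `threshold` as an assumed parameter):

```python
def build_network(curves, threshold):
    """Connect two customers when their load curves are close (Euclidean)."""
    n = len(curves)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            dist = sum((a - b) ** 2 for a, b in zip(curves[i], curves[j])) ** 0.5
            if dist <= threshold:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def communities(adj):
    """Connected components as a minimal stand-in for community detection."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(sorted(comp))
    return comps

def extract_cp(curves, community):
    """Consumption pattern = point-wise mean curve of one community."""
    return [sum(curves[i][t] for i in community) / len(community)
            for t in range(len(curves[0]))]
```

A modularity-based community detector would replace `communities` in practice; the pipeline shape (network, partition, per-community mean CP) stays the same.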
Aiming at the tele-operation instability caused by the time delay of internet information transfer in internet-based tele-robotics, this paper proposes a novel control framework for internet-based tele-robotics that guarantees distortion-free transfer of control information and reduces the difference in action time between the local simulated virtual robot and the remote real robot. This framework is insensitive to the inherent internet time delay, and differs from other tele-robotics systems that try to describe the internet delay with mathematical models or rely on simplifying assumptions. To verify the framework, a 4-DOF fischertechnik industrial robot tele-operation system was developed using the proposed framework. Experimental results demonstrate its practical performance. The framework is open-structured and can be applied to other general-purpose tele-operation systems.
Suppose compact sets E and F are quasi uniformly disconnected and quasi Ahlfors-David regular.This paper proves that E and F are quasi-Lipschitz equivalent if and only if they have the same Hausdorff dimension.
A new device for evaluating the low-contrast target acquisition capability of a photoelectric theodolite is designed, and its reliability is experimentally demonstrated. The adjustable-contrast optical target device, which can simulate the sky background luminance and a low-contrast target, uses a large integrating sphere and a small one to simulate the luminance of the background and the target, respectively. Importantly, by controlling the luminous flux of the two integrating spheres, the target and background radiance can be adjusted continuously while the color temperature remains constant. Thus, the contrast can be controlled continuously in the range of 0%-90%, with a stability better than 1%. The maximum background luminance exceeds 60 W·m^-2·sr^-1 over the spectral range of 400-800 nm.
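The continuous contrast control can be expressed with a simple luminance relation; this assumes a Weber-style contrast definition, C = (Lt - Lb)/Lb, which the abstract does not specify:

```python
def target_luminance(l_background, c):
    """Luminance to set on the target sphere for a requested contrast C,
    under the assumed Weber-style definition C = (Lt - Lb) / Lb."""
    return l_background * (1.0 + c)

def contrast(l_target, l_background):
    """Recover the contrast from the two luminances."""
    return (l_target - l_background) / l_background
```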
As cloud computing technology matures, cloud services have become trust-based services. Users' distrust of the security and performance of cloud services will hinder their rapid deployment and development, so cloud service providers (CSPs) urgently need a way to prove that the infrastructure and behavior of the cloud services they provide can be trusted. The challenge is to construct a novel framework that can effectively verify the security conformance of cloud services, focusing on fine-grained descriptions of cloud service behavior and security service level agreements (SLAs). In this paper, we propose a novel approach, CloudSec, to verify cloud service security conformance. It reduces the description gap between the CSP and users by modeling cloud service behavior and the security SLA; these models enable a systematic integration of security constraints and service behavior into the cloud, while UPPAAL is used to check performance and security conformance. The proposed approach is validated through a case study and experiments with a real cloud service based on OpenStack, which illustrate the effectiveness of the CloudSec approach and its applicability to realistic cloud scenarios.
To improve the efficiency of evolutionary algorithms (EAs) for solving complex problems with large populations, this paper proposes a scalable parallel evolution optimization (SPEO) framework with an elastic asynchronous migration (EAM) mechanism. SPEO addresses two main challenges that arise in large-scale parallel EAs: (1) heavy communication workload from extensive information exchange across numerous processors, which reduces computational efficiency, and (2) loss of population diversity due to similar solutions generated and shared by many processors. The EAM mechanism introduces a self-adaptive communication scheme to mitigate communication overhead, while a diversity-preserving buffer helps maintain diversity by filtering similar solutions. Experimental results on eight CEC2014 benchmark functions using up to 512 CPU cores on the Australian National Computational Infrastructure (NCI) platform demonstrate that SPEO not only scales efficiently with an increasing number of processors but also achieves improved solution quality compared to state-of-the-art island-based EAs.
k Nearest Neighbor (kNN) search is one of the most important operations in spatial and spatio-temporal databases. Although it has received considerable attention in the database literature, there is little prior work on kNN retrieval for moving object trajectories. Motivated by this observation, this paper studies the problem of efficiently processing kNN (k ≥ 1) search on R-tree-like structures storing historical information about moving object trajectories. Two algorithms based on the best-first traversal paradigm are developed, called BFPkNN and BFTkNN, which handle kNN retrieval with respect to a static query point and a moving query trajectory, respectively. Both algorithms minimize the number of node accesses; that is, they access only those qualifying nodes that may contain the final result. Aiming at saving main-memory consumption and further reducing CPU cost, several effective pruning heuristics are also presented. Extensive experiments with synthetic and real datasets confirm that the proposed algorithms significantly outperform their competitors in both efficiency and scalability.
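The best-first traversal shared by both algorithms can be sketched over a toy two-level index (the dictionary structure is an illustrative stand-in for an R-tree, and `mindist` is the usual point-to-rectangle lower bound):

```python
import heapq

def mindist(q, box):
    """Minimum distance from point q to an axis-aligned box ((xlo, ylo), (xhi, yhi))."""
    (xlo, ylo), (xhi, yhi) = box
    dx = max(xlo - q[0], 0.0, q[0] - xhi)
    dy = max(ylo - q[1], 0.0, q[1] - yhi)
    return (dx * dx + dy * dy) ** 0.5

def best_first_knn(root, q, k):
    """Best-first traversal: always expand the heap entry with the smallest
    (min)distance. A node whose mindist exceeds the distance of the k-th
    result is never popped, so only qualifying nodes are accessed."""
    heap = [(0.0, 0, root)]
    tie = 1                       # tie-breaker so dicts are never compared
    results = []
    while heap and len(results) < k:
        dist, _, entry = heapq.heappop(heap)
        if "point" in entry:      # data entry: its distance is exact, so final
            results.append(entry["point"])
        else:                     # index node: push children by their bound
            for child in entry["children"]:
                if "point" in child:
                    p = child["point"]
                    d = ((q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2) ** 0.5
                else:
                    d = mindist(q, child["box"])
                heapq.heappush(heap, (d, tie, child))
                tie += 1
    return results

# Toy index: two leaf nodes with bounding boxes, three data points.
tree = {"children": [
    {"box": ((0, 0), (2, 2)),
     "children": [{"point": (1, 1)}, {"point": (2, 2)}]},
    {"box": ((8, 8), (10, 10)),
     "children": [{"point": (9, 9)}]},
]}
```

Because `mindist` never overestimates, the first k points popped are guaranteed nearest; this is the property behind the "single access only to qualifying nodes" claim.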
The minimum independent dominating set (MIDS) problem is an important variant of the dominating set problem with numerous applications. In this work, we present an improved master-apprentice evolutionary algorithm, called MAE-PB, for solving the MIDS problem based on a path-breaking strategy. The proposed MAE-PB algorithm combines a construction function for initial solution generation and candidate solution restarting. It is a multiple-neighborhood local search algorithm that improves solution quality using a path-breaking strategy for solution recombination, based on master and apprentice solutions, and a perturbation strategy for disturbing the solution when the algorithm cannot improve its quality within a certain number of steps. We show the competitiveness of the MAE-PB algorithm by presenting computational results on classical benchmarks from the literature and a suite of massive graphs from real-world applications. The results show that the MAE-PB algorithm achieves high performance. In particular, for the classical benchmarks, the MAE-PB algorithm obtains the best-known results for seven instances, whereas for the massive graphs it improves the best-known results for 62 instances. We also investigate the key ingredients of the algorithm to determine their impact on its performance.
Mobile phone localization plays a key role in the fast-growing location-based applications domain. Most existing localization schemes rely on infrastructure support such as GSM, Wi-Fi or GPS. In this paper, we present FTrack, a novel floor localization system that identifies the floor level on which a mobile user is located in a multi-floor building. FTrack uses only the mobile phone's sensors, without any infrastructure support, and requires no prior knowledge of the building such as floor height or number of floors. Through crowdsourcing, FTrack builds a mapping table containing the magnetic field signatures of users taking the elevator/escalator or walking on the stairs between any two floors. The table can then be used by mobile users to pinpoint their current floor levels. We conduct both simulation and field studies to demonstrate the efficiency, scalability and robustness of FTrack. Our field trial shows that FTrack achieves an accuracy of over 96% in three different buildings.
Online prediction is a process that repeatedly predicts the next element of the coming period from a sequence of previous elements. This process has a broad range of applications in various areas, such as medicine, streaming media, and finance. The greatest challenge for online prediction is that the sequence data may not have explicit features, because the data is frequently updated, which makes good predictions difficult to maintain. One popular solution is to make the prediction with expert advice, and the challenge is then to pick the right experts with minimum cumulative loss. In this research, we use forex trading prediction, a good example of online prediction, as a case study. We propose an improved expert selection model that selects a good set of forex experts by learning from previously observed sequences. Our model considers not only the average mistakes made by experts but also the average profit earned by them, to achieve better performance, particularly in terms of financial profit. We demonstrate the merits of our model on two real corpora of major currency pairs with extensive experiments.
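Prediction with expert advice is commonly implemented with multiplicative weights; a minimal sketch (binary up/down forecasts and a 0-1 loss are illustrative assumptions here — the paper's model additionally weighs experts' profits):

```python
import math

def hedge_forecasts(expert_preds, outcomes, eta=0.5):
    """Exponential-weights forecaster: follow the weighted majority vote,
    then multiplicatively down-weight every expert that erred this round."""
    n = len(expert_preds[0])
    weights = [1.0] * n
    forecasts = []
    for preds, y in zip(expert_preds, outcomes):
        up_vote = sum(w for w, p in zip(weights, preds) if p == 1)
        forecasts.append(1 if up_vote >= sum(weights) / 2 else 0)
        weights = [w * math.exp(-eta * (p != y))   # loss 1 on a mistake
                   for w, p in zip(weights, preds)]
    return forecasts, weights
```

A profit-aware variant in the spirit of the paper would put a profit term in the exponent instead of the pure 0-1 loss, rewarding experts whose advice earned money.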
Users are vulnerable to privacy risks when providing their location information to location-based services (LBS). Existing work sacrifices the quality of LBS by degrading spatial and temporal accuracy to ensure user privacy. In this paper, we propose a novel approach, Complete Bipartite Anonymity (CBA), aiming to achieve both user privacy and quality of service. The theoretical basis of CBA is that if the bipartite graph of k nearby users' paths can be transformed into a complete bipartite graph, then these users achieve k-anonymity, since the set of "end points connecting to a specific start point in a graph" is an equivalence class. To achieve CBA, we design a Collaborative Path Confusion (CPC) protocol which enables nearby users to discover and authenticate each other without knowing their real identities or accurate locations, to predict the encounter location using users' movement pattern information, and to generate fake traces obfuscating the real ones. We evaluate CBA on a real-world dataset and compare its privacy performance with the existing path confusion approach. The results show that CBA enhances location privacy, increasing the chance of a user confusing his or her path with others by 4 to 16 times in low user density areas. We also demonstrate that CBA is secure under the trace identification attack.
The maximal matching problem (MMP) is to find maximal edge subsets of a given undirected graph such that no pair of edges in a subset is adjacent. It is a vitally important NP-complete problem in graph theory and applied mathematics, with numerous real-life applications in optimal combination and linear programming; solving it on electronic computers takes exponential time. Meanwhile, previous studies have usually applied deoxyribonucleic acid (DNA) molecular operations to NP-complete continuous path search problems, e.g., the Hamiltonian path problem (HPP) and the traveling salesman problem, and rarely to NP-hard problems whose solutions are discrete sets of vertices or edges, such as the minimum vertex cover problem or the graph coloring problem. In this paper, we present a DNA algorithm for solving the MMP with DNA molecular operations. For an undirected graph with n vertices and m edges, we design fixed-length DNA strands representing the vertices and edges of the graph, take appropriate steps, and obtain the solutions of the MMP within a proper length range in O(n^3) time. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation.
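For contrast with the molecular procedure, constructing one maximal (not maximum) matching is straightforward on a conventional computer; a greedy sketch of the maximal-matching concept itself:

```python
def greedy_maximal_matching(edges):
    """Scan the edges once; keep an edge iff neither endpoint is matched yet.
    The result is maximal: every remaining edge touches a matched vertex."""
    matched, matching = set(), []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

The hardness the paper targets lies in exploring the full space of such subsets, which is what the DNA solution space encodes in parallel.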
Funding (contingent self-esteem and self-compassion studies): the Jilin Science and Technology Department (20200201280JC) and the Shanghai special fund for ideological and political work at Shanghai University of International Business and Economics.
Funding (GLCM-based reversible image transformation): the National Key R&D Program of China (2018YFB1003205); the National Natural Science Foundation of China (61502242, U1536206, U1405254, 61772283, 61602253, 61672294); the Jiangsu Basic Research Programs-Natural Science Foundation (BK20150925, BK20151530); the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD) fund; and the Collaborative Innovation Center of Atmospheric Environment and Equipment Technology (CICAEET) fund, China.
Funding (load curve clustering): the Major Program of the National Natural Science Foundation of China (61432006).
Funding (internet-based tele-robotics): the National Natural Science Foundation of China (60776816) and the Scientific Research Foundation of the Education Department of Yunnan Province (08Y10326).
Funding (quasi-Lipschitz equivalence): the Program for New Century Excellent Talents in University of China and the National Natural Science Foundation of China (11071224).
Funding: Supported by the Innovation Fund of the Chinese Academy of Sciences (Grant No. YZ200904).
Abstract: A new device for detecting the low-contrast target acquisition capability of a photoelectric theodolite is designed, and its reliability is experimentally demonstrated. The adjustable-contrast optical target device, which can simulate the sky background luminance and a low-contrast target, uses a large integrating sphere and a small one to simulate the luminance of the background and the target, respectively. Importantly, by controlling the luminous flux of the two integrating spheres, the target and background radiance can be adjusted continuously at constant color temperature. Thus, the contrast can be controlled continuously in the range of 0%-90%, with a stability better than 1%. The maximum background luminance exceeds 60 W·m⁻²·sr⁻¹ in the spectral range of 400-800 nm.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. U1636208 and 61862008).
Abstract: As cloud computing technology matures, cloud services have become trust-based services. Users' distrust of the security and performance of cloud services will hinder their rapid deployment and development, so cloud service providers (CSPs) urgently need a way to prove that the infrastructure and behavior of the cloud services they provide can be trusted. The challenge is to construct a novel framework that can effectively verify the security conformance of cloud services, focusing on fine-grained descriptions of cloud service behavior and security service level agreements (SLAs). In this paper, we propose a novel approach to verifying cloud service security conformance that reduces the description gap between the CSP and users by modeling cloud service behavior and the security SLA. These models enable a systematic integration of security constraints and service behavior into the cloud, while UPPAAL is used to check performance and security conformance. The proposed approach is validated through a case study and experiments with a real cloud service based on OpenStack, which illustrate the effectiveness of the CloudSec approach and show that it can be applied to realistic cloud scenarios.
Funding: This research was funded by the Zhejiang "JIANBING" R&D Project (No. 2022C01055) and the Zhejiang Provincial Department of Transport Technology Project (No. 2024011).
Abstract: To improve the efficiency of evolutionary algorithms (EAs) for solving complex problems with large populations, this paper proposes a scalable parallel evolution optimization (SPEO) framework with an elastic asynchronous migration (EAM) mechanism. SPEO addresses two main challenges that arise in large-scale parallel EAs: (1) a heavy communication workload from extensive information exchange across numerous processors, which reduces computational efficiency, and (2) loss of population diversity due to similar solutions being generated and shared by many processors. The EAM mechanism introduces a self-adaptive communication scheme to mitigate communication overhead, while a diversity-preserving buffer helps maintain diversity by filtering out similar solutions. Experimental results on eight CEC2014 benchmark functions using up to 512 CPU cores on the Australian National Computational Infrastructure (NCI) platform demonstrate that SPEO not only scales efficiently with an increasing number of processors but also achieves better solution quality than state-of-the-art island-based EAs.
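The diversity-preserving buffer can be illustrated with a minimal sketch. The Euclidean distance test and the `min_dist` threshold are assumptions for illustration; the abstract does not specify the filtering rule.

```python
def accept_migrant(buffer, candidate, min_dist=0.5):
    """Diversity-preserving buffer (illustrative rule, not the authors'
    exact one): a migrant solution is admitted only if it differs enough
    from every solution already in the buffer."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    if all(dist(candidate, s) >= min_dist for s in buffer):
        buffer.append(candidate)
        return True
    return False      # too similar to a buffered solution: filtered out
```

A near-duplicate of a buffered solution is rejected, so islands exchanging many similar individuals do not collapse the population onto one region of the search space.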
Abstract: k Nearest Neighbor (kNN) search is one of the most important operations in spatial and spatio-temporal databases. Although it has received considerable attention in the database literature, there is little prior work on kNN retrieval for moving object trajectories. Motivated by this observation, this paper studies the problem of efficiently processing kNN (k ≥ 1) search on R-tree-like structures storing historical information about moving object trajectories. Two algorithms based on the best-first traversal paradigm are developed, called BFPkNN and BFTkNN, which handle kNN retrieval with respect to a static query point and a moving query trajectory, respectively. Both algorithms minimize the number of node accesses; that is, they access only those qualifying nodes that may contain the final result. To save main-memory consumption and further reduce CPU cost, several effective pruning heuristics are also presented. Extensive experiments with synthetic and real datasets confirm that the proposed algorithms significantly outperform their competitors in both efficiency and scalability.
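The best-first traversal idea behind BFPkNN can be sketched for 2-D points (the paper's algorithms work on trajectory data and add pruning heuristics; the node layout below is a simplification for illustration):

```python
import heapq
import math


def mindist(q, mbr):
    """Lower bound on the distance from query point q to anything inside
    the minimum bounding rectangle mbr = (x1, y1, x2, y2)."""
    dx = max(mbr[0] - q[0], 0.0, q[0] - mbr[2])
    dy = max(mbr[1] - q[1], 0.0, q[1] - mbr[3])
    return math.hypot(dx, dy)


def best_first_knn(root, q, k):
    """Best-first kNN over an R-tree-like structure: always expand the
    heap entry with the smallest distance bound, so only nodes that may
    contain a result are ever accessed.  Nodes are
    ('leaf', [(id, point), ...]) or ('inner', [(mbr, child), ...])."""
    heap = [(0.0, 0, 'node', root)]   # (bound, tie-breaker, kind, payload)
    counter, result = 1, []
    while heap and len(result) < k:
        d, _, kind, payload = heapq.heappop(heap)
        if kind == 'point':
            result.append((payload, d))       # provably the next nearest
        elif payload[0] == 'leaf':
            for oid, pt in payload[1]:
                heapq.heappush(heap, (math.dist(q, pt), counter, 'point', oid))
                counter += 1
        else:                                 # inner node: descend by mindist
            for mbr, child in payload[1]:
                heapq.heappush(heap, (mindist(q, mbr), counter, 'node', child))
                counter += 1
    return result
```

Because a popped point's distance is never larger than any remaining bound in the heap, results come out in distance order, and subtrees whose mindist exceeds the k-th distance are never visited, which is the "single access only to qualifying nodes" property.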
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61806050, 61972063, and 61976050), the Fundamental Research Funds for the Central Universities (2412020FZ030, 2412019ZD013, and 2412019FZ051), and the Jilin Science and Technology Association (QT202005).
Abstract: The minimum independent dominating set (MIDS) problem is an important variant of the dominating set problem with many applications. In this work, we present an improved master-apprentice evolutionary algorithm based on a path-breaking strategy, called MAE-PB, for solving the MIDS problem. The proposed MAE-PB algorithm combines a construction function for initial solution generation with candidate solution restarting. It is a multiple-neighborhood local search algorithm that improves solution quality using a path-breaking strategy for solution recombination based on master and apprentice solutions, together with a perturbation strategy that disturbs the solution when the algorithm cannot improve solution quality within a certain number of steps. We show the competitiveness of the MAE-PB algorithm by presenting computational results on classical benchmarks from the literature and a suite of massive graphs from real-world applications. The results show that the MAE-PB algorithm achieves high performance. In particular, for the classical benchmarks, the MAE-PB algorithm obtains the best-known results for seven instances, whereas for the massive graphs, it improves the best-known results for 62 instances. We also investigate the proposed key ingredients to determine their impact on the performance of the algorithm.
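The abstract does not specify the construction function; a standard greedy that could serve as one is sketched below. It exploits the fact that any maximal independent set is automatically an independent dominating set; the minimum-degree tie-breaking heuristic is an assumption.

```python
def greedy_mids(adj):
    """Greedy construction of an independent dominating set
    (a hypothetical initial-solution builder, not the paper's):
    repeatedly pick the still-selectable vertex with the fewest
    selectable neighbours, then remove its closed neighbourhood."""
    n = len(adj)
    available = set(range(n))
    solution = set()
    while available:
        v = min(available,
                key=lambda u: sum(1 for w in adj[u] if w in available))
        solution.add(v)
        available.discard(v)
        available -= set(adj[v])   # neighbours can no longer be chosen
    return solution
```

The loop ends only when no vertex can be added, so the result is maximal independent, hence dominating; an evolutionary algorithm would then try to shrink such solutions.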
Funding: This work was supported by the National High Technology Research and Development 863 Program of China under Grant No. 2013AA01A213 and the National Natural Science Foundation of China under Grant Nos. 91318301, 61373011, and 61321491.
Abstract: Mobile phone localization plays a key role in the fast-growing domain of location-based applications. Most existing localization schemes rely on infrastructure support such as GSM, Wi-Fi, or GPS. In this paper, we present FTrack, a novel floor localization system that identifies the floor level on which a mobile user is located in a multi-floor building. FTrack uses only the mobile phone's sensors, without any infrastructure support, and requires no prior knowledge of the building such as floor height or floor levels. Through crowdsourcing, FTrack builds a mapping table containing the magnetic field signatures of users taking the elevator or escalator, or walking on the stairs, between any two floors. The table can then be used by mobile users to pinpoint their current floor level. We conduct both simulation and field studies to demonstrate the efficiency, scalability, and robustness of FTrack. Our field trial shows that FTrack achieves an accuracy of over 96% in three different buildings.
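The lookup step against the crowdsourced mapping table might look like the following sketch. Everything here is hypothetical (the abstract does not describe the matching rule): signatures are plain magnitude sequences, naive resampling aligns lengths, and mean squared error picks the closest floor pair.

```python
def match_floor_transition(trace, table):
    """Hypothetical lookup step: compare a recorded magnetic-field trace
    with each crowdsourced signature and return the floor pair whose
    signature is closest (mean squared difference after resampling both
    sequences to a common length)."""
    def resample(sig, m):
        # crude nearest-index resampling to m samples
        return [sig[int(i * len(sig) / m)] for i in range(m)]

    best, best_err = None, float('inf')
    for floors, signature in table.items():
        m = min(len(trace), len(signature))
        a, b = resample(trace, m), resample(signature, m)
        err = sum((x - y) ** 2 for x, y in zip(a, b)) / m
        if err < best_err:
            best, best_err = floors, err
    return best
```

A production system would need alignment that is robust to walking speed (e.g. dynamic time warping), but the table-lookup structure is the same.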
Funding: This work was supported by the Natural Science Foundation of Guangdong Province, China (2015A030310509), the National Natural Science Foundation of China (Grant Nos. 61370229, 61272067, and 61303049), the S&T Planning Key Projects of Guangdong Province (2014B010117007, 2015B010109003, 2015A030401087, 2016A030303055, 2016B030305004, and 2016B010109008), and the S&T Projects of Guangzhou Municipality, China (201604010003).
Abstract: Online prediction is a process that repeatedly predicts the next element of a sequence from the previously observed elements. It has a broad range of applications in areas such as medicine, streaming media, and finance. The greatest challenge for online prediction is that the sequence data may lack explicit features because the data is frequently updated, which makes good predictions difficult to maintain. One popular solution is prediction with expert advice, where the challenge is to pick the right experts with minimum cumulative loss. In this research, we use forex trading prediction, a good example of online prediction, as a case study. We propose an improved expert selection model that selects a good set of forex experts by learning from previously observed sequences. Our model considers not only the average mistakes made by experts but also the average profit they earn, achieving better performance, particularly in terms of financial profit. We demonstrate the merits of our model with extensive experiments on corpora of two major real currency pairs.
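A weighting rule in the spirit of the abstract, combining mistake counts with earned profit, can be sketched as follows. The exact update is hypothetical: the multiplicative-weights penalty, the linear profit bonus, and the parameters `eta` and `gamma` are illustrative choices, not the authors' model.

```python
def update_weights(weights, mistakes, profits, eta=0.5, gamma=0.1):
    """Hypothetical expert-weighting rule: experts are down-weighted for
    cumulative mistakes (multiplicative-weights style) and up-weighted in
    proportion to cumulative profit, then renormalised."""
    new = []
    for w, m, p in zip(weights, mistakes, profits):
        w *= (1 - eta) ** m        # penalise each recorded mistake
        w *= (1 + gamma * p)       # reward accumulated profit
        new.append(w)
    total = sum(new)
    return [w / total for w in new]   # a probability distribution
```

Predictions would then follow the weighted majority of the experts, so an expert that is often wrong but highly profitable when right can still retain influence, which is the profit-aware twist the abstract describes.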
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 61373011, 91318301, and 61321491.
Abstract: Users are vulnerable to privacy risks when providing their location information to location-based services (LBS). Existing work sacrifices the quality of LBS by degrading spatial and temporal accuracy to ensure user privacy. In this paper, we propose a novel approach, Complete Bipartite Anonymity (CBA), aiming to achieve both user privacy and quality of service. The theoretical basis of CBA is that if the bipartite graph of k nearby users' paths can be transformed into a complete bipartite graph, then these users achieve k-anonymity, since the set of end points connecting to a specific start point in such a graph is an equivalence class. To achieve CBA, we design a Collaborative Path Confusion (CPC) protocol that enables nearby users to discover and authenticate each other without knowing their real identities or accurate locations, predict the encounter location using users' moving pattern information, and generate fake traces obfuscating the real ones. We evaluate CBA on a real-world dataset and compare its privacy performance with the existing path confusion approach. The results show that CBA enhances location privacy by increasing the chance of a user confusing his or her path with others by 4 to 16 times in low user density areas. We also demonstrate that CBA is secure under the trace identification attack.
Abstract: The maximal matching problem (MMP) is to find a maximal edge subset of a given undirected graph such that no pair of edges in the subset is adjacent. It is a vitally important NP-complete problem in graph theory and applied mathematics, with numerous real-life applications in combinatorial optimization and linear programming. Solving it on an electronic computer requires exponential time. Meanwhile, previous studies have usually applied deoxyribonucleic acid (DNA) molecular operations to NP-complete continuous path search problems, e.g., the HPP and the traveling salesman problem, and rarely to NP-hard problems whose solutions are discrete sets of vertices or edges, such as the minimum vertex cover problem and the graph coloring problem. In this paper, we present a DNA algorithm for solving the MMP with DNA molecular operations. For an undirected graph with n vertices and m edges, we design fixed-length DNA strands representing the vertices and edges of the graph, take appropriate steps, and obtain the solutions of the MMP within a proper length range in O(n^3) time. We extend the application of DNA molecular operations and simultaneously reduce the complexity of the computation.
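The two defining properties in the first sentence, being a matching and being maximal, can be made concrete with a small checker. This is a verification aid for candidate solutions, not any part of the DNA procedure itself.

```python
def is_maximal_matching(edges, subset):
    """Check the two defining properties of a maximal matching:
    (1) matching: no two edges in `subset` share an endpoint;
    (2) maximal: every edge of the graph touches some matched vertex,
        so no further edge could be added to `subset`."""
    matched = set()
    for u, v in subset:
        if u in matched or v in matched:
            return False          # adjacent edges: not a matching
        matched.update((u, v))
    return all(u in matched or v in matched for (u, v) in edges)
```

On the path 0-1-2-3, for example, {(0,1), (2,3)} passes both checks, while {(0,1)} alone fails maximality because edge (2,3) could still be added.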