Search result list, 201 - 250 of 793
  • 201.
    Baqer Mollah, Muhammad
    et al.
    Department of Computer Science and Engineering, Jahangirnagar University, Dhaka.
    Kalam Azad, Md. Abul
    Department of Computer Science and Engineering, Jahangirnagar University, Dhaka.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Security and privacy challenges in mobile cloud computing: Survey and way ahead, 2017. In: Journal of Network and Computer Applications, ISSN 1084-8045, E-ISSN 1095-8592, Vol. 84, p. 38-54. Article in journal (Refereed)
    Abstract [en]

    The rapid growth of mobile computing is seriously challenged by resource-constrained mobile devices. However, mobile computing can be enhanced by integrating it with cloud computing, giving rise to a new computing paradigm called mobile cloud computing. Here, data is stored in cloud infrastructure and the actual execution is shifted to the cloud environment, so that a mobile user is freed from the resource constraints of existing mobile devices. Moreover, to avail of cloud services, communication between mobile devices and clouds takes place over a wireless medium, which introduces new classes of security and privacy challenges. The purpose of this survey is to present the main security and privacy challenges in this field, which have attracted much interest among academia and the research community. Although there are many challenges, corresponding security solutions have been proposed and identified in the literature by many researchers to counter them. We also present these recent works in brief. Furthermore, we compare these works based on different security and privacy requirements, and finally present open issues.

  • 202.
    Kim, Dohyung
    et al.
    Sungkyunkwan University, Suwon, South Korea.
    Bi, Jun
    Tsinghua University, Beijing .
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Yeom, Ikjun
    Sungkyunkwan University, Suwon .
    Security of Cached Content in NDN, 2017. In: IEEE Transactions on Information Forensics and Security, ISSN 1556-6013, E-ISSN 1556-6021, Vol. 12, no 12, p. 2933-2944. Article in journal (Refereed)
    Abstract [en]

    In Named-Data Networking (NDN), content is cached in network nodes and served for future requests. This property of NDN allows attackers to inject poisoned content into the network and isolate users from valid content sources. Since a digital signature is embedded in every piece of content in the NDN architecture, poisoned content is discarded if routers perform signature verification; however, verifying every piece of content at every router would be overly expensive. In our preliminary work, we suggested a content verification scheme that minimizes unnecessary verification and favors already verified content in the content store, which reduces the verification overhead by as much as 90% without failing to detect any poisoned content. Under this scheme, however, routers are vulnerable to verification attack, in which a large amount of unverified content is accessed to exhaust system resources. In this paper, we carefully examine the possible concerns of our preliminary work, including verification attack, and present a simple but effective solution. The proposed solution mitigates the weakness of our preliminary work and allows the scheme to be deployed in real-world applications.
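
    A minimal Python sketch of the general idea described here (skip re-verification for content already verified in the content store, verify unverified content only probabilistically); the class, probability value and data layout are illustrative and not the authors' exact scheme:

        import random

        class ContentStore:
            """Toy NDN content store that remembers which cached items were already verified."""
            def __init__(self, verify_prob=0.2):
                self.cache = {}            # name -> (content, verified_flag)
                self.verify_prob = verify_prob
                self.verifications = 0

            def _verify_signature(self, name, content):
                self.verifications += 1
                return not content.get("poisoned", False)   # stand-in for real signature checking

            def serve(self, name, content):
                cached = self.cache.get(name)
                if cached and cached[1]:                     # hit on verified content: no re-check
                    return cached[0]
                if random.random() < self.verify_prob:       # verify only a fraction of unverified content
                    if not self._verify_signature(name, content):
                        self.cache.pop(name, None)           # drop poisoned content
                        return None
                    self.cache[name] = (content, True)
                else:
                    self.cache[name] = (content, False)
                return content

        cs = ContentStore()
        for _ in range(100):
            cs.serve("/video/seg1", {"data": b"payload", "poisoned": False})
        print("signature verifications for 100 requests:", cs.verifications)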

  • 203.
    Abedin, Md. Zainal
    et al.
    University of Science and Technology, Chittagong.
    Paul, Sukanta
    University of Science and Technology, Chittagong.
    Akhter, Sharmin
    University of Science and Technology, Chittagong.
    Siddiquee, Kazy Noor E Alam
    University of Science and Technology, Chittagong.
    Hossain, Mohammad Shahadat
    University of Chittagong, Bangladesh.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Selection of Energy Efficient Routing Protocol for Irrigation Enabled by Wireless Sensor Networks, 2017. In: Proceedings of 2017 IEEE 42nd Conference on Local Computer Networks Workshops, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 75-81. Conference paper (Refereed)
    Abstract [en]

    Wireless Sensor Networks (WSNs) make a remarkable contribution to real-time decision making by sensing and actuating on the surrounding environment. As a consequence, contemporary agriculture now uses WSN technology for better crop production, for example irrigation scheduling based on moisture-level data sensed by the sensors. Since WSNs are deployed in constrained environments, the lifetime of the sensors is crucial for normal operation of the network. In this regard, the routing protocol is a prime factor for a prolonged sensor lifetime. This research focuses on the performance analysis of several clustering-based routing protocols in order to select the best one. Four algorithms are considered, namely Low Energy Adaptive Clustering Hierarchy (LEACH), Threshold Sensitive Energy Efficient sensor Network (TEEN), Stable Election Protocol (SEP) and Energy Aware Multi Hop Multi Path (EAMMH). The simulation is carried out in the Matlab framework by using the mathematical models of these algorithms in a heterogeneous environment. The performance metrics considered are stability period, network lifetime, number of dead nodes per round, number of cluster heads (CH) per round, throughput and average residual energy per node. The experimental results illustrate that TEEN provides a larger stable region and longer lifetime than the others, while SEP ensures more throughput.
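
    As an illustration of the kind of clustering the compared protocols rely on, the following minimal Python sketch implements the standard randomized LEACH cluster-head election threshold T(n) = p / (1 - p * (r mod 1/p)); it is not the authors' Matlab model, and all parameter values are illustrative:

        import random

        def leach_elect_cluster_heads(nodes, p=0.1, current_round=0, last_ch_round=None):
            """One round of randomized LEACH cluster-head election.

            nodes: list of node ids; p: desired fraction of cluster heads;
            last_ch_round: dict node -> round in which it last served as CH.
            A node that served as CH within the last 1/p rounds is ineligible.
            """
            if last_ch_round is None:
                last_ch_round = {}
            period = int(1 / p)
            threshold = p / (1 - p * (current_round % period))
            heads = []
            for n in nodes:
                served_recently = (n in last_ch_round and
                                   current_round - last_ch_round[n] < period)
                if not served_recently and random.random() < threshold:
                    heads.append(n)
                    last_ch_round[n] = current_round
            return heads, last_ch_round

        heads, history = leach_elect_cluster_heads(list(range(100)), p=0.1, current_round=3)
        print(len(heads), "cluster heads elected")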

  • 204.
    Lan, Kun
    et al.
    Department of Computer and Information Science, University of Macau.
    Fong, Simon
    Department of Computer and Information Science, University of Macau.
    Song, Wei
    School of Computer Science, North China University of Technology, Beijing .
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Milham, Richard C.
    Department of Information Technology, Durban University of Technology.
    Self-Adaptive Pre-Processing Methodology for Big Data Stream Mining in Internet of Things Environmental Sensor Monitoring, 2017. In: Symmetry, ISSN 2073-8994, E-ISSN 2073-8994, Vol. 9, no 10, article id 244. Article in journal (Refereed)
    Abstract [en]

    Over the years, advanced IT technologies have facilitated the emergence of new ways of generating and gathering data rapidly, continuously, and at large scale, and are associated with a new research and application branch, namely data stream mining (DSM). Among the many scenarios of DSM, the Internet of Things (IoT) plays a significant role, typically representing a tough and challenging computational case of big data. In this paper, we describe a self-adaptive approach to the pre-processing step of data stream classification. The proposed algorithm allows different divisions, with both variable numbers and lengths of sub-windows, under a whole sliding window on an input stream, and clustering-based particle swarm optimization (CPSO) is adopted as the main metaheuristic search method to guarantee that the stream segmentations are effective and self-adaptive. In order to create a richer search space, statistical feature extraction (SFX) is applied after the variable partitioning of the entire sliding window. We validate and test our algorithm against other temporal methods on several IoT environmental sensor monitoring datasets. The experiments yield encouraging outcomes, supporting the view that heuristically picking appropriate variant sub-window segmentations with an incorporated clustering technique allows the method to perform better than the others.
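
    A simplified Python sketch of the two ingredients named in the abstract, variable-length sub-window partitioning of a sliding window followed by statistical feature extraction; the CPSO search is replaced here by a random partition, and all names and values are illustrative:

        import random
        import statistics

        def random_partition(window, k):
            """Split a window into k contiguous sub-windows of random lengths."""
            cuts = sorted(random.sample(range(1, len(window)), k - 1))
            bounds = [0] + cuts + [len(window)]
            return [window[bounds[i]:bounds[i + 1]] for i in range(k)]

        def sfx(sub_windows):
            """Statistical feature extraction: mean, stdev, min, max per sub-window."""
            feats = []
            for sw in sub_windows:
                feats += [statistics.mean(sw), statistics.pstdev(sw), min(sw), max(sw)]
            return feats

        stream_window = [random.gauss(20.0, 2.0) for _ in range(60)]   # e.g. temperature readings
        features = sfx(random_partition(stream_window, k=4))
        print(len(features), "features passed to the stream classifier")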

  • 205.
    Chude-Okonkwo, U.A.K.
    et al.
    University of Pretoria, South Africa.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Malekian, Reza
    University of Pretoria, South Africa.
    Maharaj, Bodhaswar T.J.
    University of Pretoria, South Africa.
    Simulation analysis of inter-symbol interference in diffusion-based molecular communication with non-absorbing receiver, 2017. In: Proceedings of the 4th ACM International Conference on Nanoscale Computing and Communication, NanoCom 2017, New York: ACM Digital Library, 2017, article id 13. Conference paper (Refereed)
    Abstract [en]

    This paper presents an analysis of inter-symbol interference (ISI) in a typical diffusion-based molecular communication system with a non-absorbing molecular receiver, without considering any artificially applied ISI mitigation technique. We employ a stochastic simulation approach to analyze the influence of varying numbers of transmitted molecules and of the molecules' degradation rates.
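
    A minimal Python sketch of the underlying physics, the standard 3D free-diffusion point-source concentration with first-order degradation, and of ISI as the residual contribution of earlier symbols at the sampling instant; this is not the authors' simulator, and all parameter values are illustrative:

        import math

        def concentration(n_tx, r, t, D, k):
            """Expected concentration at distance r and time t for n_tx molecules released
            at t=0, with diffusion coefficient D and degradation rate k (3D free diffusion)."""
            if t <= 0:
                return 0.0
            return (n_tx / (4 * math.pi * D * t) ** 1.5) * math.exp(-r * r / (4 * D * t)) * math.exp(-k * t)

        D, k, r, Ts = 1e-10, 0.5, 1e-6, 0.2       # m^2/s, 1/s, m, s (illustrative values)
        bits = [1, 0, 1, 1]                       # transmitted on-off keyed symbols
        sample_t = len(bits) * Ts                 # sampling instant of the last symbol
        total = sum(b * concentration(1e4, r, sample_t - i * Ts, D, k)
                    for i, b in enumerate(bits))
        desired = bits[-1] * concentration(1e4, r, Ts, D, k)
        print("ISI fraction at the receiver:", 1 - desired / total if total else 0.0)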

  • 206.
    Liao, Xuhong
    et al.
    National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    He, Yong
    National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University.
    Small-World Human Brain Networks: Perspectives and Challenges, 2017. In: Neuroscience and Biobehavioral Reviews, ISSN 0149-7634, E-ISSN 1873-7528, Vol. 77, p. 286-300. Article in journal (Refereed)
    Abstract [en]

    Modelling the human brain as a complex network has provided a powerful mathematical framework to characterize the structural and functional architectures of the brain. In the past decade, the combination of non-invasive neuroimaging techniques and graph theoretical approaches has enabled us to map human structural and functional connectivity patterns (i.e., the connectome) at the macroscopic level. One of the most influential findings is that human brain networks exhibit prominent small-world organization. Such a network architecture in the human brain facilitates efficient information segregation and integration at low wiring and energy costs, which presumably results from natural selection under the pressure of a cost-efficiency balance. Moreover, the small-world organization undergoes continuous changes during normal development and aging and exhibits dramatic alterations in neurological and psychiatric disorders. In this review, we survey recent advances regarding the small-world architecture in human brain networks and highlight the potential implications and applications in multidisciplinary fields, including cognitive neuroscience, medicine and engineering. Finally, we highlight several challenging issues and areas for future research in this rapidly growing field.
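
    For intuition, a hedged Python sketch of the usual small-world coefficient sigma = (C/C_rand)/(L/L_rand), computed here on a synthetic Watts-Strogatz graph standing in for an imaging-derived brain network (uses networkx):

        import networkx as nx

        def small_world_sigma(G, n_rand=5, seed=1):
            """sigma = (C/C_rand) / (L/L_rand); sigma >> 1 indicates small-world organization."""
            C, L = nx.average_clustering(G), nx.average_shortest_path_length(G)
            c_rand, l_rand = [], []
            for i in range(n_rand):
                R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=seed + i)
                giant = R.subgraph(max(nx.connected_components(R), key=len))
                c_rand.append(nx.average_clustering(R))
                l_rand.append(nx.average_shortest_path_length(giant))
            return (C / (sum(c_rand) / n_rand)) / (L / (sum(l_rand) / n_rand))

        G = nx.connected_watts_strogatz_graph(200, 8, 0.1, seed=42)   # stand-in for a connectome
        print("small-world sigma:", round(small_world_sigma(G), 2))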

  • 207.
    Runardotter, Mari
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Gylling, Arne
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Distance- Spanning Technology.
    Lindberg, Johanna
    Region Norrbotten.
    Päivärinta, Tero
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Tingström, Johan
    iTid Tarinfo AB.
    Ylinenpää, Roger
    Luleå kommun.
    Smarta hållbara byar i Övre Norrland: Förstudie – nuläge och framtid [Smart sustainable villages in Upper Norrland: pre-study – current state and future], 2017. Report (Other (popular science, discussion, etc.))
  • 208.
    Bera, Samaresh
    et al.
    Computer Science and Engineering Department, Indian Institute of Technology, Kharagpur, 721302, India..
    Misra, Sudip
    Computer Science and Engineering Department, Indian Institute of Technology, Kharagpur, 721302, India..
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Software-Defined Networking for Internet of Things: a Survey, 2017. In: IEEE Internet of Things Journal, ISSN 2327-4662, Vol. 4, no 6, p. 1994-2008. Article in journal (Refereed)
    Abstract [en]

    The Internet of Things (IoT) enables billions of devices with network connectivity to collect and exchange real-time information for providing intelligent services. Thus, IoT allows connected devices to be controlled and accessed remotely in the presence of adequate network infrastructure. Unfortunately, traditional network technologies, such as enterprise networks and classic timeout-based transport protocols, are not capable of handling such requirements of IoT in an efficient, scalable, seamless, and cost-effective manner. Meanwhile, the advent of software-defined networking (SDN) introduces features that allow network operators and users to control and access network devices remotely, while leveraging a global view of the network. In this respect, we provide a comprehensive survey of different SDN-based technologies that are useful to fulfill the requirements of IoT, from different networking aspects: edge, access, core, and data center networking. In these areas, the utility of SDN-based technologies is discussed, while presenting the different challenges and requirements that arise in the context of IoT applications. We present a synthesized overview of the current state of IoT development. We also highlight some future research directions and open research issues based on the limitations of the existing SDN-based technologies.

  • 209.
    Ranjan, Rajiv
    et al.
    China University of Geosciences, Wuhan.
    Wang, Lizhe
    China University of Geosciences, Wuhan.
    Prakash Jayaraman, Prem
    Swinburne University of Technology, Melbourne.
    Mitra, Karan
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Georgakopoulos, Dimitrios
    Swinburne University of Technology, Melbourne.
    Special issue on Big Data and Cloud of Things (CoT), 2017. In: Software, practice & experience, ISSN 0038-0644, E-ISSN 1097-024X, Vol. 47, no 3, p. 345-347. Article in journal (Refereed)
  • 210.
    Zhang, Yi-Qing
    et al.
    Adaptive Networks and Control Laboratory, Department of Electronic Engineering, and the Center of Smart Networks and Systems, School of Information Science and Engineering, Fudan University.
    Li, Xiang
    Adaptive Networks and Control Laboratory, Department of Electronic Engineering, and the Center of Smart Networks and Systems, School of Information Science and Engineering, Fudan University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Spectral Analysis of Epidemic Thresholds of Temporal Networks, 2017. In: IEEE Transactions on Cybernetics, ISSN 2168-2267, E-ISSN 2168-2275. Article in journal (Refereed)
    Abstract [en]

    Many complex systems can be modeled as temporal networks with time-evolving connections. The influence of their characteristics on epidemic spreading is analyzed in a susceptible-infected-susceptible epidemic model illustrated by the discrete-time Markov chain approach. We develop the analytical epidemic thresholds in terms of the spectral radius of weighted adjacency matrix by averaging temporal networks, e.g., periodic, nonperiodic Markovian networks, and a special nonperiodic non-Markovian network (the link activation network) in time. We discuss the impacts of statistical characteristics, e.g., bursts and duration heterogeneity, as well as time-reversed characteristic on epidemic thresholds. We confirm the tightness of the proposed epidemic thresholds with numerical simulations on seven artificial and empirical temporal networks and show that the epidemic threshold of our theory is more precise than those of previous studies.
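

    A small Python sketch of the threshold condition the abstract refers to: for an SIS process, spreading dies out when the effective infection rate beta/delta stays below 1/rho(A_avg), the inverse spectral radius of the time-averaged weighted adjacency matrix. The toy temporal network and rate values are illustrative, not the paper's exact derivation:

        import numpy as np

        rng = np.random.default_rng(0)
        T, n = 20, 50
        snapshots = [(rng.random((n, n)) < 0.05).astype(float) for _ in range(T)]
        snapshots = [np.triu(A, 1) + np.triu(A, 1).T for A in snapshots]   # undirected, no self-loops

        A_avg = sum(snapshots) / T                      # time-averaged weighted adjacency matrix
        rho = max(abs(np.linalg.eigvals(A_avg)))        # spectral radius
        tau_c = 1.0 / rho                               # critical effective infection rate

        beta, delta = 0.08, 0.5                         # infection and recovery rates (illustrative)
        print(f"threshold tau_c = {tau_c:.3f}, effective rate beta/delta = {beta/delta:.3f}")
        print("epidemic dies out" if beta / delta < tau_c else "epidemic can persist")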

  • 211.
    Booth, Todd
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. Luleå University of Technology, Centre for Critical Infrastructure and Societal Security.
    Stronger Authentication for Password Credential Internet Services, 2017. In: Proceedings of the 2017 Third Conference on Mobile and Secure Services (MOBISECSERV) / [ed] Pascal Urien, Selwyn Piramuthu, Piscataway, NJ: IEEE conference proceedings, 2017, p. 41-45, article id 7886566. Conference paper (Refereed)
    Abstract [en]

    Most Web and other on-line service providers ("Internet Services") only support legacy ID (or email) and password (ID/PW) credential authentication. However, there are numerous vulnerabilities concerning ID/PW credentials. Scholars and the industry have proposed several improved security solutions, such as MFA; however, most Internet Services have refused to adopt these solutions. Mobile phones are much more sensitive to these vulnerabilities, so this paper focuses on mobile phones. Many users take advantage of password managers to keep track of all their Internet Service profiles. However, the Internet Service profiles found in password managers are normally kept on the PC or mobile phone's disk, in an encrypted form. Our first contribution is a design guideline whereby the Internet Service profiles never need to touch the client's disk. Most users would benefit if they had the ability to use MFA to log in to a legacy Internet Service that only supports ID/PW credential authentication. Our second contribution is a design guideline whereby users can choose, for each legacy ID/PW Internet Service, which specific MFA they wish to use. We also present conceptual design guidelines showing that both of our contributions are minor changes to existing password managers, which can be implemented easily with low overhead.

  • 212.
    Yu, Xixun
    et al.
    The State Key Laboratory on Integrated Services Networks, School of Cyber Engineering, Xidian University, Xi’an.
    Yan, Zheng
    The State Key Laboratory on Integrated Services Networks, School of Cyber Engineering, Xidian University, Xi’an.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Survey of Verifiable Computation, 2017. In: Mobile Networks and Applications, ISSN 1383-469X, Vol. 22, no 3, p. 438-453. Article in journal (Refereed)
    Abstract [en]

    The Internet of Things (IoT) has emerged to motivate various intelligent applications based on the data collected by the "things". Meanwhile, cloud computing offers an efficient and convenient way to store, process and analyze huge amounts of data. Because a Cloud Service Provider (CSP) that is employed to store and process user private data is actually not in the trust domain of cloud users, data security becomes a serious issue in cloud computing. One crucial problem is that a cloud data processing result may be incorrect and thus cannot be fully trusted. This calls for research on verifying the correctness of data processing at the cloud in order to enhance its trustworthiness, especially for encrypted data processing. At present, various cryptosystems have been proposed to achieve verifiability with different characteristics and quality. However, the literature still lacks a thorough survey reviewing the current state of the art to give a comprehensive view of this research field, named verifiable computation. In this paper, we review existing work on verifiable computation by comparing and discussing pros and cons according to performance requirements, highlight open research issues through careful review and analysis, and propose a number of research directions in order to guide future research.

  • 213.
    Kamal, Ahmed E.
    et al.
    Department of Electrical & Computer Engineering, Iowa State University, USA.
    Imran, Muhammad
    College of Computer and Information Sciences, King Saud University, Saudi Arabia.
    Chen, Hsiao-Hwa
    Department of Engineering Science, National Cheng Kung University, Taiwan.
    Vasilakos, Athanasios V.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Survivability strategies for emerging wireless networks, 2017. In: Computer Networks, ISSN 1389-1286, E-ISSN 1872-7069, Vol. 128, p. 1-4. Article in journal (Refereed)
  • 214.
    Andersson, Karl
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Rondeau, Éric
    University of Lorraine.
    Kor, Ah-Lian
    School of Computing, Creative Technologies and Engineering, Leeds Beckett University.
    Johansson, Dan
    Umea University, Department of Informatics.
    Sustainable Mobile Computing and Communications, 2017. In: Mobile Information Systems, ISSN 1574-017X, Vol. 2017, article id 1098264. Article in journal (Refereed)
  • 215.
    Ahmed, Ejaz
    et al.
    The Centre for Mobile Cloud Computing Research, Faculty of Computer Science and Information Technology, University of Malaya.
    Yaqoob, Ibrar
    The Centre for Mobile Cloud Computing Research, Faculty of Computer Science and Information Technology, University of Malaya.
    Hashem, Ibrahim Abaker Targio
    The Centre for Mobile Cloud Computing Research, Faculty of Computer Science and Information Technology, University of Malaya.
    Khan, Imran
    Schneider Electric Industries, Grenoble.
    Ahmed, Abdelmuttlib Ibrahim Abdalla
    The Centre for Mobile Cloud Computing Research, Faculty of Computer Science and Information Technology, University of Malaya.
    Imran, Muhammad
    College of Computer and Information Sciences, King Saud University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    The role of big data analytics in Internet of Things, 2017. In: Computer Networks, ISSN 1389-1286, E-ISSN 1872-7069, Vol. 129, no 2, p. 459-471. Article in journal (Refereed)
    Abstract [en]

    The explosive growth in the number of devices connected to the Internet of Things (IoT) and the exponential increase in data consumption only reflect how the growth of big data perfectly overlaps with that of IoT. The management of big data in a continuously expanding network gives rise to non-trivial concerns regarding data collection efficiency, data processing, analytics, and security. To address these concerns, researchers have examined the challenges associated with the successful deployment of IoT. Despite the large number of studies on big data, analytics, and IoT, the convergence of these areas creates several opportunities for flourishing big data and analytics for IoT systems. In this paper, we explore the recent advances in big data analytics for IoT systems as well as the key requirements for managing big data and for enabling analytics in an IoT environment. We taxonomize the literature based on important parameters. We identify the opportunities resulting from the convergence of big data, analytics, and IoT and discuss the role of big data analytics in IoT applications. Finally, several open challenges are presented as future research directions.

  • 216.
    Li, Di
    et al.
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Nan, Zhou
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Jiafu, Wan
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Zhenkun, Zhai
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Towards a model-integrated computing paradigm for reconfigurable motion control system, 2017. In: IEEE International Conference on Industrial Informatics (INDIN), Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 756-761, article id 7819260. Conference paper (Refereed)
    Abstract [en]

    To accommodate the trend toward mass customization launched by intelligent manufacturing, this paper proposes the adoption of the model-integrated computing (MIC) paradigm in the motion control system development process for enhancing flexibility and robustness. Hierarchical structural and behavioral diversities in motion control systems are considered during the implementation of the MIC paradigm. For design-phase implementation, a motion-control-domain-specific modeling language is developed, and formal semantics are integrated. With regard to execution-phase implementation, a real-time runtime framework compliant with the IEC 61499 standard is proposed. Extensions of function block chaining and priority-based event propagation are proposed, and a dynamically extendable library of FB types for the motion control domain is constructed. A prototype three-axis motion control system is modeled using the proposed modeling language and is then deployed to the implemented framework to prove the feasibility of adopting the MIC paradigm in the motion control domain.

  • 217.
    Visuri, Aku
    et al.
    Center for Ubiquitous Computing, University of Oulu.
    Ferreira, Denzil
    Center for Ubiquitous Computing, University of Oulu.
    Pirttikangas, Susanna
    Center for Ubiquitous Computing, University of Oulu.
    Kostakos, Vassilis
    University of Melbourne.
    Synnes, Kåre
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Lindqvist, Janne K.O.
    Rutgers University.
    Nishiyama, Yuuki
    Keio University.
    UbiMI'17: Ubiquitous mobile instrumentation, 2017. In: UbiComp/ISWC 2017: Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, 11 September 2017, New York: Association for Computing Machinery (ACM), 2017, p. 448-451. Conference paper (Refereed)
    Abstract [en]

    Mobile devices (smartphones, smartwatches, etc.) allow us to reach people anywhere, anytime. Collectively, these devices form a ubiquitous computer that offers valuable insights on the user. In addition to the benefits for researchers and developers, explored in previous UbiMI workshops, devices can also help individuals understand their own health, activities, and behaviour. The Ubiquitous Mobile Instrumentation (UbiMI) workshop focuses on using mobile devices as instruments to collect sensing data, to understand human-behaviour and routines, and to gather users' context using sensor instrumentation

  • 218.
    Synnes, Kåre
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Kranz, Matthias
    University of Passau.
    Rana, Juwel
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Schelén, Olov
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    User-Centric Social Interaction for Digital Cities, 2017. In: The internet of things: breakthroughs in research and practice, IGI Global, 2017, p. 41-70. Chapter in book (Refereed)
    Abstract [en]

    Pervasive Computing was envisioned by pioneers like Mark Weiser, but has yet to become an everyday technology in our society. The recent advances regarding the Internet of Things, social computing and mobile access technologies, however, converge to make pervasive computing truly ubiquitous. The key challenge is to make simple and robust solutions for normal users, which shifts the focus from complex platforms involving machine learning and artificial intelligence to more hands-on construction of services that are tailored or personalized for individual users. This chapter therefore discusses the Internet of Things together with Social Computing as a basis for components that users in a 'digital city' could utilize to make their daily life better, safer, etc. A novel environment for user-created services, such as social apps, is presented as a possible solution for this. The vision is that anyone could make a simple service based on Internet-enabled devices (Internet of Things) and encapsulated digital resources such as Open Data, which can also have social aspects embedded. This chapter also aims to identify trends, challenges and recommendations in regard to Social Interaction for Digital Cities. This work will help expose future themes with high innovation and business potential based on a timeframe roughly 15 years ahead of now. The purpose is to create a common outlook on the future of information and communication technologies (ICT) based on the extrapolation of current trends and ongoing research efforts.

  • 219.
    Lin, Di
    et al.
    School of Information and Software Engineering, University of Electronic Science and Technology of China.
    Tang, Yu
    School of Information and Software Engineering, University of Electronic Science and Technology of China.
    Yao, Yuanzhe
    School of Information and Software Engineering, University of Electronic Science and Technology of China.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    User-priority based power control over the D2D assisted Internet of vehicles for mobile health, 2017. In: IEEE Internet of Things Journal, ISSN 2327-4662, Vol. 4, no 3, p. 824-831, article id 7858717. Article in journal (Refereed)
    Abstract [en]

    A device-to-device (D2D) assisted cellular network is pervasive in supporting ubiquitous healthcare applications, since it is expected to bring significant benefits such as improved user throughput and extended battery life of mobiles. However, D2D and cellular communications in the same network may cause cross-tier interference (CTI) to each other. A further critical issue of using D2D assisted cellular networks in a healthcare scenario is the electromagnetic interference (EMI) caused by RF transmission, since a high level of EMI may lead to critical malfunction of medical equipment. In consideration of CTI and EMI, we study the problem of optimizing the individual channel rates of mobile users of different priorities (different levels of emergency) within the Internet of vehicles for mobile health, and propose an algorithm that controls the transmit power to solve this problem within a game-theoretical framework. Numerical results show that the proposed algorithm converges linearly to the optimum, while ensuring an allowable level of EMI on medical equipment.
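
    A hedged Python sketch of priority-weighted, game-style power control: each user repeatedly plays the best response to a stand-in utility w_i*log(1 + SINR_i) - c*p_i, with a per-user power cap standing in for the EMI limit. The utility, gains and caps are illustrative and not the authors' exact game:

        import numpy as np

        def best_response_power(gains, weights, noise=1e-3, cost=1.0, p_max=0.1, iters=50):
            """Iterative best response for u_i = w_i*log(1 + g_ii*p_i/I_i) - cost*p_i,
            where I_i is noise plus received interference; p_max models the EMI cap."""
            n = gains.shape[0]
            p = np.zeros(n)
            for _ in range(iters):
                for i in range(n):
                    interference = noise + sum(gains[i, j] * p[j] for j in range(n) if j != i)
                    p_star = weights[i] / cost - interference / gains[i, i]   # closed-form best response
                    p[i] = min(p_max, max(0.0, p_star))
            return p

        rng = np.random.default_rng(1)
        G = rng.uniform(0.01, 0.1, (4, 4)); np.fill_diagonal(G, 1.0)   # channel gains
        w = np.array([2.0, 1.0, 1.0, 0.5])                             # user priorities (emergency first)
        print(best_response_power(G, w))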

  • 220.
    Al-Turjman, Fadi M.
    et al.
    Department of Computer Engineering, Middle East Technical University, Northern Cyprus Campus.
    Imran, Muhammad
    College of Computer and Information Sciences, King Saud University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Value-Based Caching in Information-Centric Wireless Body Area Networks, 2017. In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 17, no 1, article id 181. Article in journal (Refereed)
    Abstract [en]

    We propose a resilient cache replacement approach based on a Value of sensed Information (VoI) policy. To resolve and fetch content when the origin is not available due to isolated in-network nodes (fragmentation) and harsh operational conditions, we exploit a content caching approach. Our approach depends on four functional parameters in sensory Wireless Body Area Networks (WBANs). These four parameters are: age of data based on periodic request, popularity of on-demand requests, communication interference cost, and the duration for which the sensor node is required to operate in active mode to capture the sensed readings. These parameters are considered together to assign a value to the cached data to retain the most valuable information in the cache for prolonged time periods. The higher the value, the longer the duration for which the data will be retained in the cache. This caching strategy provides significant availability for most valuable and difficult to retrieve data in the WBANs. Extensive simulations are performed to compare the proposed scheme against other significant caching schemes in the literature while varying critical aspects in WBANs (e.g., data popularity, cache size, publisher load, connectivity-degree, and severe probabilities of node failures). These simulation results indicate that the proposed VoI-based approach is a valid tool for the retrieval of cached content in disruptive and challenging scenarios, such as the one experienced in WBANs, since it allows the retrieval of content for a long period even while experiencing severe in-network node failures.
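
    A small Python sketch of the cache-replacement idea: score each cached item from the four parameters named in the abstract and evict the lowest-value entry when the cache is full. The weighting and scaling below are invented for illustration, not the paper's VoI policy:

        def voi(age, popularity, interference_cost, active_duration,
                w=(0.3, 0.4, 0.2, 0.1)):
            """Value of Information for a cached item: fresher, more popular content that is
            costly to re-fetch (interference, long sensing duration) is worth keeping longer."""
            freshness = 1.0 / (1.0 + age)
            return (w[0] * freshness + w[1] * popularity +
                    w[2] * interference_cost + w[3] * active_duration)

        def insert(cache, key, item, capacity=4):
            """Insert into a VoI-style cache, evicting the least valuable entry when full."""
            if len(cache) >= capacity and key not in cache:
                victim = min(cache, key=lambda k: voi(**cache[k]))
                del cache[victim]
            cache[key] = item

        cache = {}
        insert(cache, "temp/12", dict(age=2, popularity=0.9, interference_cost=0.5, active_duration=0.7))
        insert(cache, "ecg/3", dict(age=10, popularity=0.2, interference_cost=0.1, active_duration=0.3))
        print(sorted(cache))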

  • 221.
    Zhang, Qingke
    et al.
    School of Computer Science and technology, Engineering Research Center of Digital Media Technology, Ministry of Education, Shandong University, Jinan.
    Liu, Weiguo
    School of Computer Science and technology, Engineering Research Center of Digital Media Technology, Ministry of Education, Shandong University, Jinan.
    Men, Xiangxu
    School of Computer Science and technology, Engineering Research Center of Digital Media Technology, Ministry of Education, Shandong University, Jinan.
    Jiang, Bo
    Shandong Provincial Key Laboratory of Network based Intelligent Computing, University of Jinan.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Vector coevolving particle swarm optimization algorithm, 2017. In: Information Sciences, ISSN 0020-0255, E-ISSN 1872-6291, Vol. 394-395, p. 273-298. Article in journal (Refereed)
    Abstract [en]

    In this paper, we propose a novel vector coevolving particle swarm optimization algorithm (VCPSO). In VCPSO, the full dimension of each particle is first randomly partitioned into several sub-dimensions. Then, we randomly assign either one of our newly designed scalar operators or learning operators to update the values in each sub-dimension. The scalar operators are designed to enhance the population diversity and avoid premature convergence. In addition, the learning operators are designed to enhance the global and local search ability. The proposed algorithm is compared with several other classical swarm optimizers on thirty-three benchmark functions. Comprehensive experimental results show that VCPSO displays a better or comparable performance compared to the other algorithms in terms of solution accuracy and statistical results.
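
    A greatly simplified Python sketch of the structure described above: each particle's dimensions are randomly split into sub-vectors, and each sub-vector is updated either by a diversity ("scalar") operator or by a standard PSO learning update. The published operator set is richer; objective, constants and partition scheme here are illustrative:

        import random

        def sphere(x):                       # benchmark objective to minimize
            return sum(v * v for v in x)

        def vcpso_like(dim=10, swarm=20, iters=200, sub=2, lo=-5.0, hi=5.0):
            X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
            V = [[0.0] * dim for _ in range(swarm)]
            P = [x[:] for x in X]                                  # personal bests
            g = min(P, key=sphere)[:]                              # global best
            for _ in range(iters):
                for i in range(swarm):
                    idx = list(range(dim)); random.shuffle(idx)
                    parts = [idx[j::sub] for j in range(sub)]      # random sub-dimension partition
                    for part in parts:
                        if random.random() < 0.3:                  # diversity ("scalar") operator
                            for d in part:
                                X[i][d] = random.uniform(lo, hi)
                        else:                                      # PSO learning operator
                            for d in part:
                                V[i][d] = (0.72 * V[i][d]
                                           + 1.49 * random.random() * (P[i][d] - X[i][d])
                                           + 1.49 * random.random() * (g[d] - X[i][d]))
                                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
                    if sphere(X[i]) < sphere(P[i]):
                        P[i] = X[i][:]
                        if sphere(P[i]) < sphere(g):
                            g = P[i][:]
            return sphere(g)

        print("best objective found:", vcpso_like())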

  • 222.
    Islam, Mohammad A.
    et al.
    Florida International University, Miami.
    Ren, Shaolei
    University of California, Riverside, CA.
    Quan, Gang
    Florida International University, Miami.
    Shakir, Muhammad Zeeshan
    Texas A&M University, Qatar.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Water-Constrained Geographic Load Balancing in Data Centers, 2017. In: IEEE Transactions on Cloud Computing, ISSN 2168-7161, Vol. 5, no 2, p. 208-220, article id 7152842. Article in journal (Refereed)
    Abstract [en]

    Spreading across many parts of the world and presently striking California hard, extended droughts could even potentially threaten reliable electricity production and local water supplies, both of which are critical for data center operation. While numerous efforts have been dedicated to reducing data centers' energy consumption, the enormity of data centers' water footprints is largely neglected and, if left unchecked, may handicap service availability during droughts. In this paper, we propose a water-aware workload management algorithm, called WATCH (WATer-constrained workload sCHeduling in data centers), which caps data centers' long-term water consumption by exploiting spatio-temporal diversities of water efficiency and dynamically dispatching workloads among distributed data centers. We demonstrate the effectiveness of WATCH both analytically and empirically using simulations: based on only online information, WATCH results in a provably low operational cost while successfully capping water consumption under a desired level. Our results also show that WATCH can cut water consumption by 20 percent while only incurring a negligible cost increase even compared to a state-of-the-art cost-minimizing but water-oblivious solution. Sensitivity studies are conducted to validate WATCH under various settings.
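
    For intuition only, a Python sketch of water-constrained geographic dispatch: workload is sent to the cheapest data centers first, subject to per-center water budgets (water use approximated as load times a water-usage factor). The real WATCH algorithm is an online scheme with provable guarantees; the greedy rule and numbers below are illustrative:

        def dispatch(load, centers):
            """Greedy cost-first dispatch under per-data-center water budgets.

            centers: list of dicts with keys price ($/unit), wue (litres per unit of load),
            cap (capacity), water_budget (litres remaining). Returns (allocation, unserved load)."""
            alloc = [0.0] * len(centers)
            order = sorted(range(len(centers)), key=lambda i: centers[i]["price"])
            for i in order:
                c = centers[i]
                max_by_water = c["water_budget"] / c["wue"] if c["wue"] > 0 else c["cap"]
                take = min(load, c["cap"], max_by_water)
                alloc[i] = take
                c["water_budget"] -= take * c["wue"]
                load -= take
                if load <= 0:
                    break
            return alloc, load

        centers = [dict(price=0.04, wue=1.8, cap=60, water_budget=50),
                   dict(price=0.06, wue=0.9, cap=80, water_budget=200)]
        print(dispatch(100, centers))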

  • 223.
    Cai, Hongming
    et al.
    School of Software, Shanghai Jiao Tong University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Web of Things Data Storage, 2017. In: Managing the Web of Things: Linking the Real World to the Web, Elsevier, 2017, p. 325-354. Chapter in book (Refereed)
    Abstract [en]

    With the widespread adoption of Web of Things (WoT) technology, massive data are generated by huge numbers of distributed sensors and different applications. WoT-related applications have emerged as an important area for both engineers and researchers. As a consequence, how to acquire, integrate, store, process and use these data has become an urgent and important problem for enterprises seeking to achieve their business goals. Based on an analysis of data processing functions, a framework is provided to identify the representation, management, and disposal areas of WoT data. Several associated functional modules are defined and described in terms of their key characteristics and capabilities. Then, current research on WoT applications is organized and compared to show the state-of-the-art achievements in the literature from the viewpoint of the data processing process. Next, some WoT storage techniques are discussed to enable WoT applications to move onto cloud platforms. Lastly, based on an analysis of application requirements, some future technical tendencies are also proposed.

  • 224.
    Islam, Raihan Ul
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Wireless Sensor Network Based Flood Prediction Using Belief Rule Based Expert System, 2017. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Flood is one of the most devastating natural disasters. It is estimated that flooding from sea level rise will cause one trillion USD of damage to major coastal cities of the world by the year 2050. Flood not only damages the economy, but also creates physical and psychological suffering for humans and destroys infrastructure. Disseminating flood warnings and evacuating people from flood-affected areas help to save human lives. Therefore, predicting floods will help government authorities to take the necessary actions to evacuate people and arrange relief.

    This licentiate thesis focuses on four different aspects of flood prediction using wireless sensor networks (WSNs). Firstly, different WSNs, protocols related to WSN, and backhaul connectivity in the context of predicting flood were investigated. A heterogeneous WSN network for flood prediction was proposed.

    Secondly, data coming from sensors contain anomaly due to different types of uncertainty, which hampers the accuracy of flood prediction. Therefore, anomalous data needs to be filtered out. A novel algorithm based on belief rule base for detecting the anomaly from sensor data has been proposed in this thesis.

    Thirdly, predicting flood is a challenging task as it involves multi-level factors, which cannot be measured with 100% certainty. Belief rule based expert systems (BRBESs) can be considered to handle the complex problem of this nature as they address different types of uncertainty. A web based BRBES was developed for predicting flood. This system provides better usability, more computational power to handle larger numbers of rule bases and scalability by porting it into a web-based solution. To improve the accuracy of flood prediction, a learning mechanism for multi-level BRBES was proposed. Furthermore, a comparison between the proposed multi-level belief rule based learning algorithm and other machine learning techniques including Artificial Neural Networks (ANN), Support Vector Machine (SVM) based regression, and Linear Regression has been performed.

    In the light of the research findings of this thesis, it can be argued that flood prediction can be accomplished more accurately by integrating WSN and BRBES.

  • 225.
    Wang, Yufeng
    et al.
    Nanjing University of Posts and Telecommunications.
    Dai, Wei
    Nanjing University of Posts and Telecommunications.
    Zhang, Bo
    Nanjing University of Posts and Telecommunications.
    Ma, Jianhua
    Hosei University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Word of Mouth Mobile Crowdsourcing: Increasing Awareness of Physical, Cyber, and Social Interactions, 2017. In: IEEE MultiMedia, E-ISSN 1070-986X, Vol. 24, no 4, p. 26-37. Article in journal (Refereed)
    Abstract [en]

    By fully exploring various sensing capabilities and multiple wireless interfaces of mobile devices and integrating them with human power and intelligence, mobile crowdsourcing (MCS) is emerging as an effective paradigm for large-scale multimedia-related applications. However, most MCS schemes use a direct mode, in which crowdworkers passively or actively select tasks and contribute without interacting and collaborating with each other; such a mode can hamper some time-constrained crowdsourced tasks. This article explores a different approach: MCS based on word of mouth (WoM), in which crowdworkers, apart from executing tasks, exploit their mobile social networks and/or physical encounters to actively recruit other appropriate individuals to work on the task. The authors describe a WoM-based MCS architecture and typical applications, which they divide into Internet-scale and local scale. They then systematically summarize the main technical challenges, including crowdworker recruitment, incentive design, security and privacy, and data quality control, and they compare typical solutions. Finally, from a systems-level viewpoint, they discuss several practical issues that must be resolved. This article is part of a special issue on cybersecurity.

  • 226.
    Al-Dulaimi, Anwer
    et al.
    ECE, University of Toronto.
    Anpalagan, Alagan
    WINCORE Lab, Ryerson University, Toronto.
    Bennis, Mehdi
    University of Oulu, Centre for Wireless Communications, University of Oulu.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    5G Green Communications: C-RAN Provisioning of CoMP and Femtocells for Power Management, 2016. In: 2015 IEEE International Conference on Ubiquitous Wireless Broadband (ICUWB): Montreal, Canada, 4-7 October 2015, Piscataway, NJ: IEEE Communications Society, 2016, article id 7324392. Conference paper (Refereed)
    Abstract [en]

    The fifth generation (5G) wireless network is expected to have dense deployments of cells in order to provide efficient Internet and cellular connections. The cloud radio access network (C-RAN) emerges as one of the 5G solutions to steer the network architecture and control resources beyond the legacy radio access technologies. The C-RAN decouples traffic management operations from the radio access technologies, leading to a new combination of virtualized network core and fronthaul architecture. In this paper, we first investigate the power consumption impact of aggressive deployments of low-power neighborhood femtocell networks (NFNs) under the umbrella of a coordinated multipoint (CoMP) macrocell. We show that the power savings obtained from employing a low-power NFN start to decline as the density of deployed femtocells exceeds a certain threshold. The analysis considers two CoMP sites, at the cell-edge and intra-cell areas. Second, to restore power efficiency and network stabilization, a C-RAN model is proposed to restructure the NFN into clusters to ease the energy burden in the evolving 5G systems. Tailoring this to the traffic load, selected clusters are switched off to save power when they operate with low traffic loads.

  • 227.
    Karim, Razuan
    et al.
    Department of Computer Science and Engineering , University of Science and Technology Chittagong.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Distance- Spanning Technology.
    Hossain, Mohammad Shahadat
    Department of Computer Science and Engineering , University of Chittagong.
    Uddin, Md. Jasim
    Department of Computer Science and Engineering , University of Science and Technology Chittagong.
    Meah, Md. Perveg
    Department of Computer Science and Engineering , University of Science and Technology Chittagong.
    A Belief Rule Based Expert System to Assess Clinical Bronchopneumonia Suspicion, 2016. In: Proceedings of Future Technologies Conference 2016 (FTC 2016) / [ed] Flavio Villanustre and Arjuna Chala, IEEE, 2016, p. 655-660. Conference paper (Refereed)
    Abstract [en]

    Bronchopneumonia is an acute or chronic inflammation of the lungs, in which the alveoli and/or interstitium are affected. Usually the diagnosis of bronchopneumonia is carried out using the signs and symptoms of the disease, which cannot be measured precisely since they involve various types of uncertainty. Consequently, traditional disease diagnosis, as performed by a physician, cannot deliver accurate results. Therefore, this paper presents the design, development and application of an expert system for assessing the suspicion of bronchopneumonia under uncertainty. The Belief Rule-Based Inference Methodology using the Evidential Reasoning (RIMER) approach was adopted to develop this expert system, which is named the Belief Rule-Based Expert System (BRBES). The system can handle various types of uncertainty in knowledge representation and inference procedures. The knowledge base of this system was constructed by using real patient data and expert opinion. Practical case studies were used to validate the system. The system-generated results are more effective and reliable in terms of accuracy than the results generated by a manual system.
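
    A much-simplified Python sketch of belief-rule evaluation: rules are activated by how well the observed symptoms match their antecedents, and the consequent belief degrees are combined by a weighted average. This additive aggregation stands in for the full RIMER evidential-reasoning combination, and the rules and numbers are invented for illustration:

        # Each rule: antecedent referential values for (fever, cough), a rule weight,
        # and belief degrees over the consequent grades {low, medium, high} suspicion.
        rules = [
            {"ante": (1.0, 1.0), "weight": 1.0, "beliefs": (0.0, 0.2, 0.8)},
            {"ante": (1.0, 0.0), "weight": 0.8, "beliefs": (0.2, 0.6, 0.2)},
            {"ante": (0.0, 0.0), "weight": 1.0, "beliefs": (0.9, 0.1, 0.0)},
        ]

        def matching_degree(ante, obs):
            """Similarity between observed (normalized) symptoms and a rule's antecedents."""
            return 1.0 - sum(abs(a - o) for a, o in zip(ante, obs)) / len(ante)

        def assess(obs):
            acts = [r["weight"] * matching_degree(r["ante"], obs) for r in rules]
            total = sum(acts) or 1.0
            combined = [0.0, 0.0, 0.0]
            for act, r in zip(acts, rules):
                for k, b in enumerate(r["beliefs"]):
                    combined[k] += (act / total) * b
            return dict(zip(("low", "medium", "high"), (round(c, 3) for c in combined)))

        print(assess(obs=(0.9, 0.7)))   # strong fever, moderate cough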

  • 228.
    Wu, Qihui
    et al.
    College of Communications Engineering PLA University of Science and Technology, Nanjing.
    Ding, Guoro
    College of Communications Engineering PLA University of Science and Technology, Nanjing.
    Du, Zhiyong
    PLA Academy of National Defense Information, Wuhan.
    Sun, Youming
    National Digital Switching System Engineering & Technological Research Center, Zhengzhou.
    Jo, Minho
    Department of Computer and Information Science, Korea University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Cloud-Based Architecture for the Internet of Spectrum Devices (IoSD) over Future Wireless Networks, 2016. In: IEEE Access, E-ISSN 2169-3536, Vol. 4, p. 2854-2862. Article in journal (Refereed)
    Abstract [en]

    The dramatic increase in data rates in wireless networks has made radio spectrum usage an essential and critical issue. Spectrum sharing is widely recognized as an affordable, near-term method to address this issue. This article first characterizes the new features of spectrum sharing in future wireless networks, including heterogeneity in sharing bands, diversity in sharing patterns, crowd intelligence in sharing devices, and hyper-densification in sharing networks. Then, to harness the benefits of these unique features and promote a vision of spectrum without bounds and networks without borders, this article introduces the new concept of the Internet of Spectrum Devices (IoSD) and develops a cloud-based architecture for IoSD over future wireless networks, with the prime aim of building a bridging network among various spectrum monitoring devices (SMDs) and massive spectrum utilization devices (SUDs), and enabling a highly efficient spectrum sharing and management paradigm for future wireless networks. Furthermore, this article presents a systematic tutorial on the key enabling techniques of the IoSD, including big spectrum data analytics, hierarchical spectrum resource optimization, and quality of experience (QoE)-oriented spectrum service evaluation. In addition, unresolved research issues are also presented.

  • 229.
    Mohd, Bassam Jamil
    et al.
    Computer Engineering Department, Hashemite University.
    Hayajneh, Thaier
    School of Engineering and Computing Sciences, New York Institute of Technology.
    Khalaf, Zaid Abu
    School of Engineering and Computing Sciences, New York Institute of Technology.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A comparative study of steganography designs based on multiple FPGA platforms, 2016. In: International Journal of Electronic Security and Digital Forensics, ISSN 1751-911X, E-ISSN 1751-9128, Vol. 8, no 2, p. 164-190. Article in journal (Refereed)
    Abstract [en]

    Steganography methods conceal covert messages inside communicated data. Field-programmable gate array (FPGA) hardware implementation provides speed, flexibility and configurability. It is extremely difficult to compare published results from different platforms and technologies. The goal of our research work is to mitigate this dependency by examining implementations from multiple FPGA platforms. The research studies the implementations of 12 spatial steganography methods using Altera and Xilinx FPGAs. The methods include mix-bit LSB, least significant bit (LSB), random LSB and texture-based algorithms. The objective of the research is to develop platform-independent resource, timing, power and energy models in order to empower future steganography research. Further, the article evaluates the steganography methods using typical performance metrics as well as a novel performance metric. The results suggest that the mix-bit methods exhibit good performance across most of the metrics. However, when image quality is a concern, the two-bit LSB is the front runner.
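
    To illustrate the spatial-domain idea being compared (the paper studies FPGA implementations; this is plain Python, standard single-bit LSB embedding, with a toy random cover image):

        import numpy as np

        def embed_lsb(cover, message_bits):
            """Hide a bit string in the least significant bits of a uint8 cover image."""
            flat = cover.flatten().copy()
            for i, bit in enumerate(message_bits):
                flat[i] = (flat[i] & 0xFE) | bit
            return flat.reshape(cover.shape)

        def extract_lsb(stego, n_bits):
            return [int(v & 1) for v in stego.flatten()[:n_bits]]

        cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)     # toy 8x8 grayscale cover
        secret = [1, 0, 1, 1, 0, 0, 1, 0]
        stego = embed_lsb(cover, secret)
        assert extract_lsb(stego, len(secret)) == secret
        print("max pixel change:", int(np.max(np.abs(stego.astype(int) - cover.astype(int)))))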

  • 230.
    Karvonen, Niklas
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Kikhia, Basel
    Jimenez, Lara Lorna
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Gomez Simon, Miguel
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Computationally Inexpensive Classifier Merging Cellular Automata and MCP-Neurons, 2016. In: Ubiquitous Computing and Ambient Intelligence: 10th International Conference, UCAmI 2016, San Bartolomé de Tirajana, Gran Canaria, Spain, November 29 – December 2, 2016, Part II / [ed] Carmelo R. García, Pino Caballero-Gil, Mike Burmester, Alexis Quesada-Arencibia, Springer, 2016, Vol. 2, p. 368-379. Conference paper (Refereed)
    Abstract [en]

    There is an increasing need for personalised and context-aware services in our everyday lives, and we rely on mobile and wearable devices to provide such services. Context-aware applications often make use of machine-learning algorithms, but many of these are too complex or resource-consuming to implement on some devices that are common in pervasive and mobile computing. The algorithm presented in this paper, named CAMP, has been developed to obtain a classifier that is suitable for resource-constrained devices such as FPGAs, ASICs or microcontrollers. The algorithm uses a combination of the McCulloch-Pitts neuron model and Cellular Automata in order to produce a computationally inexpensive classifier with a small memory footprint. The algorithm consists of a sparse binary neural network where neurons are updated using a Cellular Automata rule as the activation function. The output of the classifier depends on the selected rule and the interconnections between the neurons. Since the input-output mapping cannot be solved mathematically using traditional optimization algorithms, the classifier is trained using a genetic algorithm. The results of the study show that CAMP, despite its minimalistic structure, has an accuracy comparable to that of more advanced algorithms on the tested datasets containing few classes, while performing poorly on datasets with a higher number of classes. CAMP could thus be a viable choice for solving classification problems in environments with extreme demands on low resource consumption.
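
    A toy Python sketch of the core mechanism (binary neurons whose activation is an elementary cellular-automaton rule applied to a 3-input neighborhood, evaluated over sparse random wiring); the genetic-algorithm training is omitted, the rule number and wiring here are arbitrary, and this is not the published CAMP implementation:

        import random

        RULE = 110   # elementary CA rule used as the activation function (arbitrary choice)

        def ca_activation(bits):
            """Apply an elementary CA rule to a 3-bit neighborhood (left, centre, right)."""
            index = bits[0] * 4 + bits[1] * 2 + bits[2]
            return (RULE >> index) & 1

        def make_network(n_inputs, n_neurons, seed=0):
            rng = random.Random(seed)
            # each neuron reads three randomly chosen positions (sparse binary wiring)
            return [tuple(rng.randrange(n_inputs) for _ in range(3)) for _ in range(n_neurons)]

        def classify(network, x, steps=3):
            state = list(x)
            for _ in range(steps):
                state = [ca_activation([state[i % len(state)] for i in wires]) for wires in network]
            return int(sum(state) > len(state) / 2)      # majority vote of the final neuron states

        net = make_network(n_inputs=8, n_neurons=8)
        print(classify(net, [1, 0, 1, 1, 0, 0, 1, 0]))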

  • 231.
    Tong, Guoxiang
    et al.
    School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology.
    Wu, Guanning
    School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology.
    Tan, Jian
    School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology.
    Xiong, Naixue
    School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A digital noise reduction scheme in communication systems for internet of things, 2016. In: Journal of Internet Technology, ISSN 1607-9264, E-ISSN 2079-4029, Vol. 17, no 5, p. 879-887. Article in journal (Refereed)
    Abstract [en]

    Data-driven computing and the use of data for strategic advantage are exemplified by communication systems, and speech intelligibility in communication systems is generally degraded by interfering noise. This interference comes from environmental noise, which reduces intelligibility by masking the signal of interest. An important task in communication systems is therefore to extract speech from noisy speech while suppressing the background noise; the primary purpose of a speech noise reduction system is to extract pure speech from a noisy speech signal. The focus of this paper is to build a new noise reduction system based on the optimization of digital noise reduction algorithms. According to program simulation results based on MATLAB, the digital noise reduction system shows improved performance at low SNR and achieves 5 dB-15 dB of noise reduction. The combined algorithm was tested under different noise conditions, and the data show that the optimized combined algorithm achieves the best performance; the simulation results demonstrate that it can perform nearly three times better than the other two algorithms. The output signal of the combined algorithm is very close to the pure speech signal, and its performance in restoring the voice signal is better than that of the other two algorithms.
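
    The paper's combined algorithm is not reproduced in this listing, so as a stand-in here is a classic magnitude spectral-subtraction pass in Python; the frame size, flooring and the assumption that the leading frames are noise-only are illustrative:

        import numpy as np

        def spectral_subtraction(x, frame=256, noise_frames=5, floor=0.02):
            """Classic magnitude spectral subtraction with 50% overlap-add."""
            hop = frame // 2
            win = np.hanning(frame)
            out = np.zeros(len(x))
            norm = np.zeros(len(x))
            starts = range(0, len(x) - frame + 1, hop)
            spectra = [np.fft.rfft(x[i:i + frame] * win) for i in starts]
            noise_mag = np.mean([np.abs(s) for s in spectra[:noise_frames]], axis=0)
            for i, s in zip(starts, spectra):
                mag = np.maximum(np.abs(s) - noise_mag, floor * noise_mag)   # subtract, then floor
                out[i:i + frame] += np.fft.irfft(mag * np.exp(1j * np.angle(s)), frame) * win
                norm[i:i + frame] += win ** 2
            return out / np.maximum(norm, 1e-8)

        fs = 8000
        t = np.arange(fs) / fs
        clean = np.where(t > 0.2, np.sin(2 * np.pi * 440 * t), 0.0)   # leading silence for noise estimation
        noisy = clean + 0.3 * np.random.randn(fs)
        enhanced = spectral_subtraction(noisy)
        print("noise power before:", round(float(np.var(noisy - clean)), 4),
              "after:", round(float(np.var(enhanced - clean)), 4))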

  • 232.
    Zhu, Xudong
    et al.
    Shanghai Key Laboratory of Scalable Computing and Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University.
    Li, Jun
    Shanghai Key Laboratory of Scalable Computing and Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University.
    Gao, Xiaofeng
    Shanghai Key Laboratory of Scalable Computing and Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University.
    Wu, Fan
    Shanghai Key Laboratory of Scalable Computing and Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University.
    Chen, Guihai
    Shanghai Key Laboratory of Scalable Computing and Systems, Department of Computer Science and Engineering, Shanghai Jiao Tong University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A distributed approximation for multi-hop clustering problem in wireless sensor networks2016In: 2015 IEEE Global Communications Conference (GLOBECOM): San Diego, CA, 6-10 Dec 2015, Piscataway, NJ: IEEE Communications Society, 2016, article id 7416941Conference paper (Refereed)
    Abstract [en]

    In wireless sensor networks (WSNs), there is no predefined infrastructure. Nodes need to frequently flood messages to discover routes, which severely degrades network performance. To overcome such drawbacks, WSNs are often grouped into several disjoint clusters, each with a representative cluster head (CH) in charge of the routing process. In order to further improve the efficiency of WSNs, it is crucial to find a cluster partition with the minimum number of clusters such that the distance between each node and its corresponding CH is bounded by a constant number of hops. Finding such a partition is defined as the minimum d-hop cluster head set (d-MCHS) problem, which is proven to be NP-hard. In this paper, we propose a distributed approximation algorithm, named d^2-Cluster, to address the d-MCHS problem, and prove that the approximation ratio of d^2-Cluster under the unit disk graph (UDG) model is a constant factor \lambda related to d. To the best of our knowledge, this is the first constant approximation ratio for the d-MCHS problem in UDGs.
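
    To make the d-MCHS objective concrete, here is a small, centralised greedy heuristic that covers every node within d hops of some cluster head while trying to use few heads. It is only an illustration of the problem: the paper's d^2-Cluster algorithm is distributed and carries a proven constant approximation ratio, which this sketch does not.

```python
# Greedy d-hop clustering on an adjacency-list graph (illustration of the objective only).
from collections import deque

def d_hop_ball(adj, src, d):
    """All nodes within d hops of src (breadth-first search)."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        u = queue.popleft()
        if dist[u] == d:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return set(dist)

def greedy_d_clustering(adj, d):
    """Repeatedly pick the node whose d-hop ball covers the most uncovered nodes."""
    uncovered, heads = set(adj), []
    balls = {u: d_hop_ball(adj, u, d) for u in adj}
    while uncovered:
        head = max(adj, key=lambda u: len(balls[u] & uncovered))
        heads.append(head)
        uncovered -= balls[head]
    return heads

# Toy topology: a ring of 10 sensor nodes, clusters of radius d = 2 hops.
ring = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
print("cluster heads:", greedy_d_clustering(ring, d=2))
```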

  • 233.
    Zhu, Nanhao
    et al.
    CETC Group, GCI Science and Technology Co.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A generic framework for energy evaluation on wireless sensor networks2016In: Wireless networks, ISSN 1022-0038, E-ISSN 1572-8196, Vol. 22, no 4, p. 1199-1220Article in journal (Refereed)
    Abstract [en]

    Due to their reliance on batteries, energy consumption has always been a significant concern for sensor node networks. This work presents the design and implementation of an in-house experimental platform, named Energy Management System for Wireless Sensor Networks (EMrise), for energy management and exploration in wireless sensor networks. Consisting of three parts, the SystemC-based simulation environment of EMrise enables HW/SW co-simulation for energy evaluation of heterogeneous sensor networks. The hardware platform of EMrise is further designed to facilitate realistic energy consumption measurement and calibration as well as accurate energy exploration. In addition, a generic genetic-algorithm-based optimization framework of EMrise is implemented to automatically, quickly and intelligently fine-tune hundreds of possible solutions for a given task in order to find the most suitable energy-aware trade-offs.

  • 234.
    Perera, Charith
    et al.
    Centre for Research in Computing, The Open University, Milton Keynes.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Knowledge-Based Resource Discovery for Internet of Things2016In: Knowledge-Based Systems, ISSN 0950-7051, E-ISSN 1872-7409, Vol. 109, p. 122-136Article in journal (Refereed)
    Abstract [en]

    In the sensing as a service paradigm, Internet of Things (IoT) middleware platforms allow data consumers to retrieve the data they want without knowing the underlying technical details of IoT resources (i.e. sensors and data processing components). However, configuring an IoT middleware platform and retrieving data is a significant challenge for data consumers, as it requires both technical knowledge and domain expertise. In this paper, we propose a knowledge-driven approach called Context Aware Sensor Configuration Model (CASCOM) to simplify the process of configuring IoT middleware platforms, so that data consumers, specifically non-technical personnel, can easily retrieve the data they require. We demonstrate how IoT resources can be described using semantics in such a way that they can later be used to compose service workflows. Such an automated, semantic-knowledge-based IoT resource composition approach advances the current research. We demonstrate the feasibility and usability of our approach through a prototype implementation based on an IoT middleware called Global Sensor Networks (GSN), though our model can be generalized to any other middleware platform.

  • 235.
    Lin, Bing
    et al.
    College of Mathematics and Computer Sciences, Fuzhou University.
    Guo, Wenzhong
    College of Mathematics and Computer Sciences, Fuzhou University.
    Xiong, Naixue
    School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology.
    Chen, Guolong
    College of Mathematics and Computer Sciences, Fuzhou University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Zhang, Hong
    College of Mathematics and Computer Sciences, Fuzhou University.
    A Pretreatment Workflow Scheduling Approach for Big Data Applications in Multi-cloud Environments2016In: IEEE Transactions on Network and Service Management, ISSN 1932-4537, E-ISSN 1932-4537, Vol. 13, no 3, p. 581-594Article in journal (Refereed)
    Abstract [en]

    The rapid development of the latest distributed computing paradigm, i.e., cloud computing, has generated a highly fragmented cloud market composed of numerous cloud providers and offers tremendous parallel computing ability to handle Big Data problems. One of the biggest challenges in multi-cloud environments is efficient workflow scheduling. Although the workflow scheduling problem has been studied extensively, there are still very few works tailored to multi-cloud environments. Moreover, the existing works either fail to satisfy the Quality of Service (QoS) requirements or do not consider fundamental features of cloud computing such as the heterogeneity and elasticity of computing resources. In this paper, a scheduling algorithm called Multi-Clouds Partial Critical Paths with Pretreatment (MCPCPP) for Big Data workflows in multi-cloud environments is presented. This algorithm incorporates the concept of partial critical paths and aims to minimize the execution cost of a workflow while satisfying a defined deadline constraint. Our approach takes into consideration the essential characteristics of multi-cloud environments, such as charging per time interval, various instance types from different cloud providers, and homogeneous intra-cloud bandwidth versus heterogeneous inter-cloud bandwidth. Various types of workflows are used for evaluation purposes, and our experimental results show that MCPCPP is promising.
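
    The cost-versus-deadline trade-off that MCPCPP targets can be illustrated on a single task path: start every task on its cheapest instance type and upgrade tasks until the deadline is met. The greedy "best time saved per extra cost" rule and the toy option lists below are assumptions for illustration only, not the paper's algorithm.

```python
# Deadline-constrained cost minimisation on one path of tasks (illustrative heuristic).
def schedule_path(options, deadline):
    """options[i] = list of (runtime, cost) instance choices for task i on the path.
    Start from the cheapest choice per task, then repeatedly upgrade the task that
    saves the most runtime per extra unit of cost until the deadline is met."""
    choice = [min(range(len(o)), key=lambda j: o[j][1]) for o in options]
    total = lambda k: sum(options[i][c][k] for i, c in enumerate(choice))
    while total(0) > deadline:
        best, best_ratio = None, -1.0
        for i, opts in enumerate(options):
            rt, cost = opts[choice[i]]
            for j, (rt2, cost2) in enumerate(opts):
                if rt2 < rt:
                    ratio = (rt - rt2) / max(cost2 - cost, 1e-9)   # "free" upgrades win outright
                    if ratio > best_ratio:
                        best, best_ratio = (i, j), ratio
        if best is None:
            raise ValueError("deadline cannot be met with the available instance types")
        choice[best[0]] = best[1]
    return choice, total(0), total(1)

# Three path tasks, each with (runtime in hours, cost) options from different clouds (assumed values).
options = [[(4, 2), (2, 5)], [(6, 3), (3, 8), (2, 12)], [(5, 2), (2, 6)]]
print(schedule_path(options, deadline=9))   # (chosen option per task, total runtime, total cost)
```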

  • 236.
    Kostenius, Catrine
    et al.
    Luleå University of Technology, Department of Health Sciences, Health and Rehab.
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Lindqvist, Anna-Karin
    Luleå University of Technology, Department of Health Sciences, Health and Rehab.
    A slice of the win-win game: Swedish schoolchildren’s ideas on gamification to promote physical activity and cognitive ability2016Conference paper (Other academic)
  • 237.
    Moberg, Carl
    et al.
    Cisco.
    Wallin, Stefan
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A two-layered data model approach for network services2016In: IEEE Communications Magazine, ISSN 0163-6804, E-ISSN 1558-1896, Vol. 54, no 3, p. 76-80Article in journal (Refereed)
    Abstract [en]

    Connectivity services are ubiquitous in enterprises, and many enterprises are looking to outsource basic networking services traditionally implemented using on-premise network equipment. The rising expectations on service providers to rapidly change the definition of services and to introduce new types of network elements are leading to exploding complexity in the orchestration layer. The severity of this problem is such that the ability to introduce new services and new device vendors in the network is reduced due to the time and cost associated with such changes.

  • 238.
    Fong, Simon
    et al.
    Department of Computer and Information Science, University of Macau.
    Wong, Raymond K.
    School of Computer Science and Engineering, University of New South Wales, Sydney.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Accelerated PSO Swarm Search Feature Selection for Data Stream Mining Big Data2016In: IEEE Transactions on Services Computing, ISSN 1939-1374, E-ISSN 1939-1374, Vol. 9, no 1, p. 33-45Article in journal (Refereed)
    Abstract [en]

    Although Big Data is a hyped term, it brings many technical challenges that confront both academic research communities and commercial IT deployments; the root sources of Big Data are data streams and the curse of dimensionality. It is generally known that data sourced from data streams accumulate continuously, making traditional batch-based model induction algorithms infeasible for real-time data mining. Feature selection has been widely used to lighten the processing load in inducing a data mining model. However, when it comes to mining high-dimensional data, the search space from which an optimal feature subset is derived grows exponentially in size, leading to an intractable demand in computation. In order to tackle this problem, which stems mainly from the high dimensionality and streaming format of data feeds in Big Data, a novel lightweight feature selection is proposed. The feature selection is designed particularly for mining streaming data on the fly, by using an accelerated particle swarm optimization (APSO) type of swarm search that achieves enhanced analytical accuracy within reasonable processing time. In this paper, a collection of Big Data with an exceptionally large degree of dimensionality is put under test with our new feature selection algorithm for performance evaluation.
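
    A compact sketch of swarm-search feature selection in the spirit of APSO is shown below: real-valued particles are thresholded into feature masks, pulled toward the global best with a small random perturbation, and scored by a cheap wrapper classifier. The swarm size, the alpha/beta coefficients and the kNN wrapper on synthetic data are illustrative assumptions rather than the paper's settings.

```python
# Simplified APSO-style swarm search over feature subsets (illustration only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=6,
                           n_redundant=10, random_state=0)

def score(mask):
    """Wrapper fitness: cross-validated accuracy of a cheap classifier on the subset."""
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3).mean()

n_particles, n_iters, alpha, beta = 12, 25, 0.2, 0.5      # assumed swarm settings
pos = rng.random((n_particles, X.shape[1]))               # real-valued particle positions
best_mask, best_score = None, -1.0
for _ in range(n_iters):
    masks = pos > 0.5                                     # threshold into feature subsets
    scores = np.array([score(m) for m in masks])
    if scores.max() > best_score:
        best_score = scores.max()
        best_mask = masks[scores.argmax()].copy()
    g = best_mask.astype(float)                           # pull particles toward the global best
    pos = (1 - beta) * pos + beta * g + alpha * rng.standard_normal(pos.shape)

print(f"selected {best_mask.sum()} / {X.shape[1]} features, CV accuracy {best_score:.3f}")
```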

  • 239.
    Moetesum, Momina
    et al.
    Department of Computer Science, Bahria University, Islamabad.
    Hadi, Fazle
    Department of Computer Science, Bahria University, Islamabad.
    Imran, Muhammad Al
    College of Computer and Information Sciences, Almuzahmiyah, King Saud University.
    Minhas, Abid Ali
    Department of Computer Science, Bahria University, Islamabad.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    An adaptive and efficient buffer management scheme for resource-constrained delay tolerant networks2016In: Wireless networks, ISSN 1022-0038, E-ISSN 1572-8196, Vol. 22, no 7, p. 2189-2201Article in journal (Refereed)
    Abstract [en]

    Provisioning a buffer management mechanism is especially crucial in resource-constrained delay tolerant networks (DTNs), as maximum data delivery ratio with minimum overhead is expected in highly congested environments. However, most DTN protocols do not consider resource limitations (e.g., buffer, bandwidth) and hence result in performance degradation. To mitigate the impact of frequent buffer overflows, this paper presents an adaptive and efficient buffer management scheme called size-aware drop (SAD) that strives to improve buffer utilization and avoid unnecessary message drops. To improve the data delivery ratio, SAD determines the exact space requirement from the difference between the size of newly arrived message(s) and the available space. To vacate the necessary space from a congested buffer, SAD strives to avoid redundant message drops and deliberately picks and discards the most appropriate message(s) to minimize overhead. The performance of SAD is validated through extensive simulations in realistic environments (i.e., resource-constrained and congested) with different mobility models (i.e., Random Waypoint and disaster). Simulation results demonstrate the superiority of SAD in terms of delivery probability and overhead ratio, among other metrics, when compared to contemporary schemes based on Epidemic (DOA and DLA) and PRoPHET (SHLI and MOFO).
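
    The size-aware idea can be illustrated with a few lines of code: on arrival of a message that does not fit, compute exactly how much space is missing and free only that much, preferring a single victim whose size just covers the shortfall. The tie-breaking rules below are a plausible reading for illustration, not the exact SAD policy.

```python
# Size-aware buffering sketch: free only the exact shortfall when a message arrives.
from dataclasses import dataclass, field

@dataclass
class Buffer:
    capacity: int
    messages: list = field(default_factory=list)   # list of (msg_id, size)

    def used(self):
        return sum(size for _, size in self.messages)

    def insert(self, msg_id, size):
        shortfall = size - (self.capacity - self.used())
        if shortfall > 0:
            self._free(shortfall)
        self.messages.append((msg_id, size))

    def _free(self, shortfall):
        # Prefer the single smallest message that covers the shortfall...
        candidates = sorted((m for m in self.messages if m[1] >= shortfall), key=lambda m: m[1])
        if candidates:
            self.messages.remove(candidates[0])
            return
        # ...otherwise drop the largest messages until enough space is vacated.
        for m in sorted(self.messages, key=lambda m: -m[1]):
            self.messages.remove(m)
            shortfall -= m[1]
            if shortfall <= 0:
                return

buf = Buffer(capacity=10)
for msg in [("a", 4), ("b", 3), ("c", 2), ("d", 5)]:
    buf.insert(*msg)
print(buf.messages)   # "d" displaces only the smallest message that covers the shortfall
```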

  • 240.
    Javaid, Nadeem
    et al.
    COMSATS Institute of Information Technology, Islamabad.
    Shah, Mehreen
    Allama Iqbal Open University, Islamabad.
    Ahmad, Ashfaq
    COMSATS Institute of Information Technology, Islamabad.
    Imran, Muhammad Al
    College of Computer and Information Sciences, Almuzahmiyah, King Saud University.
    Khan, Majid Iqbal
    COMSATS Institute of Information Technology, Islamabad.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    An Enhanced Energy Balanced Data Transmission Protocol for Underwater Acoustic Sensor Networks2016In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 16, no 4, article id 487Article in journal (Refereed)
    Abstract [en]

    This paper presents two new energy-balanced routing protocols for Underwater Acoustic Sensor Networks (UASNs): the Efficient and Balanced Energy consumption Technique (EBET) and Enhanced EBET (EEBET). The first proposed protocol avoids direct transmission over long distances to save a significant amount of the energy consumed in the routing process. The second protocol overcomes the deficiencies of both the Balanced Transmission Mechanism (BTM) and EBET. EBET selects a relay node on the basis of an optimal distance threshold, which prolongs network lifetime. The initial energy of each sensor node is divided into energy levels for balanced energy consumption. Selecting a high-energy-level node within transmission range avoids long-distance direct data transmission. EEBET incorporates a depth threshold to minimize the number of hops between the source node and the sink while eliminating backward data transmissions. The EBET technique balances energy consumption within successive ring sectors, while EEBET balances the energy consumption of the entire network. In EEBET, the optimal number of energy levels is also calculated to further enhance network lifetime. The effectiveness of the proposed schemes is validated through simulations in which they are compared with two existing routing protocols in terms of network lifetime, transmission loss, and throughput. The simulations are conducted under different network radii and varied numbers of nodes.
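
    A simplified sketch of the relay-selection step described above follows: a sender that cannot reach the sink directly forwards to a neighbour that makes progress toward the sink and sits in the highest discretised energy band. The ranges, energy model and tie-breaks are illustrative assumptions; the full EBET/EEBET protocols additionally use ring sectors, depth thresholds and an optimised number of energy levels.

```python
# Energy-level based next-hop selection (simplified illustration, not the full protocol).
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    dist_to_sink: float      # metres
    residual_energy: float   # joules

def energy_level(node, initial_energy=100.0, n_levels=5):
    """Discretise residual energy into n_levels bands (0 = almost empty)."""
    return min(int(node.residual_energy / initial_energy * n_levels), n_levels - 1)

def pick_next_hop(sender, neighbours, tx_range=150.0):
    if sender.dist_to_sink <= tx_range:
        return "sink"                                        # one hop left: send directly
    candidates = [n for n in neighbours
                  if n.dist_to_sink < sender.dist_to_sink]   # only forward progress
    if not candidates:
        return None
    # Prefer the highest energy level; break ties by progress toward the sink.
    return max(candidates, key=lambda n: (energy_level(n), -n.dist_to_sink)).name

src = Node("src", dist_to_sink=400, residual_energy=60)
nbrs = [Node("a", 300, 20), Node("b", 280, 75), Node("c", 450, 90)]
print(pick_next_hop(src, nbrs))   # "b": closer to the sink and in a high energy band
```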

  • 241.
    Wang, You
    et al.
    Institute for Network Sciences and Cyberspace, Tsinghua University.
    Bi, Jun
    Institute for Network Sciences and Cyberspace, Tsinghua University.
    Vasilakos, Athanasios
    University of Western Macedonia.
    An Identifier-Based Approach to Internet Mobility: A Survey2016In: IEEE Network, ISSN 0890-8044, E-ISSN 1558-156X, Vol. 30, no 1, p. 72-79Article in journal (Refereed)
  • 242.
    Nugent, Chris
    et al.
    Ulster University.
    Cleland, Ian
    Ulster University.
    Santanna, Anita
    Halmstad University.
    Espinilla, Macarena
    University of Jaén.
    Synnott, Jonathan
    Ulster University.
    Banos, Oresti
    University of Twente.
    Lundström, Jens
    Halmstad Universitet.
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Calzada, Alberto
    Ulster University.
    An initiative for the creation of open datasets within pervasive healthcare2016In: Proceedings of the 10th EAI International Conference onPervasive Computing Technologies for Healthcare: 16-19 May 2016, Cancun, Mexico, ICST, the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering , 2016, p. 318-321Conference paper (Refereed)
    Abstract [en]

    In this paper, issues surrounding the collection, annotation, management and sharing of data gathered from pervasive health systems are presented. The overarching motivation for this work has been to provide an approach whereby annotated datasets can be made readily accessible to the research community, in an effort to advance the state of the art in activity recognition and behavioural analysis using pervasive health systems. Recommendations for how this can be made a reality are presented, in addition to the initial steps that have been taken to facilitate such an initiative, involving the definition of common formats for data storage and a common set of tools for data processing and visualization.

  • 243.
    Klimova, Alexandra
    et al.
    ITMO University.
    Rondeau, Eric
    Université de Lorraine, Nancy.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Distance- Spanning Technology.
    Porras, Jari
    Lappeenranta University of Technology.
    Rybin, Andrei
    ITMO University.
    Zaslavsky, Arkady
    CSIRO, Australia.
    An international Master’s program in green ICT as a contribution to sustainable development2016In: Journal of Cleaner Production, ISSN 0959-6526, E-ISSN 1879-1786, Vol. 135, p. 223-239Article in journal (Refereed)
    Abstract [en]

    Various principles of sustainable development are currently being integrated into national policies and programs. Such principles relate to a range of aspects of human activities requiring urgent attention, including heating, mobility, food security, and sustainable agriculture. One of the fields contributing to the transition towards a sustainable society is that of green information and communication technologies (ICT). Therefore, the implementation of educational programs in green ICT is important in ensuring further ICT development around sustainability concerns. This article describes the development of an international Master's degree program named “Pervasive computing and communications for sustainable development” (PERCCOM) by an international consortium, which aimed to combine advanced ICT with environmental, economic, and social awareness. The article presents background information regarding the role of the ICT sector in environmental challenges, and a review of the literature, in order to understand what is required of ICT and green ICT graduates. The curriculum of the PERCCOM program is then described and findings on program improvement are reported. The article is aimed at academic and research professionals in the fields of sustainable development and green technologies, with the goal of improving educational initiatives to address the societal demand for sustainable development. The findings reported here contribute to the search for sustainability solutions, especially regarding environmental issues, by educating professionals with high expertise in networking, computing, and programming who are able to design, develop, deploy, and maintain both pervasive computing systems and communication architectures for sustainable development.

  • 244.
    Mashayekhy, Lena
    et al.
    Department of Computer Science, Wayne State University, Detroit.
    Nejad, Mahyar Movahed
    Department of Computer Science, Wayne State University, Detroit.
    Grosu, Daniel
    Department of Computer Science, Wayne State University, Detroit.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    An Online Mechanism for Resource Allocation and Pricing in Clouds2016In: I.E.E.E. transactions on computers (Print), ISSN 0018-9340, E-ISSN 1557-9956, Vol. 65, no 4, p. 1172-1184Article in journal (Refereed)
    Abstract [en]

    Cloud providers provision their various resources, such as CPUs, memory, and storage, in the form of virtual machine (VM) instances which are then allocated to users. The users are charged based on a pay-as-you-go model, and their payments should be determined by considering both their incentives and the incentives of the cloud providers. Auction markets can capture such incentives, where users name their own prices for their requested VMs. We design an auction-based online mechanism for VM provisioning, allocation, and pricing in clouds that considers several types of resources. Our proposed online mechanism makes no assumptions about future demand for VMs, which is the case in real cloud settings. The proposed online mechanism is invoked as soon as a user places a request or some of the allocated resources are released and become available. The mechanism allocates VM instances to selected users for the period they are requested for, and ensures that the users will continue using their VM instances for the entire requested period. In addition, the mechanism determines the payment the users have to pay for using the allocated resources. We prove that the mechanism is incentive-compatible, that is, it gives users an incentive to report their requests and valuations truthfully.
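
    To make the online setting concrete, the sketch below processes requests as they arrive, allocates a bundle greedily if it fits the remaining capacity and clears a flat per-unit reserve price, and returns capacity when instances expire. The reserve-price payment is a placeholder assumption; the paper's mechanism computes payments so that truthful bidding is a dominant strategy, which this sketch does not attempt.

```python
# Bare-bones online VM allocation loop with a placeholder reserve-price payment.
import heapq
from itertools import count

CAPACITY = {"cpu": 16, "mem_gb": 64, "storage_gb": 500}   # assumed data-centre capacity

def run_auction(requests, reserve_price_per_unit=0.1):
    """requests: list of (arrival, duration, bundle, bid). Accept a request if its
    bundle fits the remaining capacity and the bid clears a simple per-unit reserve
    price; release capacity when allocated instances expire."""
    free, releases, allocated, tick = dict(CAPACITY), [], [], count()
    for arrival, duration, bundle, bid in sorted(requests, key=lambda r: r[0]):
        while releases and releases[0][0] <= arrival:      # return expired bundles
            _, _, done = heapq.heappop(releases)
            for r, q in done.items():
                free[r] += q
        units = sum(bundle.values())
        price = reserve_price_per_unit * units * duration  # placeholder, not the paper's payment
        if all(free[r] >= q for r, q in bundle.items()) and bid >= price:
            for r, q in bundle.items():
                free[r] -= q
            heapq.heappush(releases, (arrival + duration, next(tick), bundle))
            allocated.append((arrival, bundle, price))
    return allocated

requests = [(0, 5, {"cpu": 4, "mem_gb": 8, "storage_gb": 50}, 40.0),
            (1, 3, {"cpu": 14, "mem_gb": 32, "storage_gb": 100}, 10.0),
            (6, 2, {"cpu": 8, "mem_gb": 16, "storage_gb": 50}, 30.0)]
print(run_auction(requests))
```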

  • 245.
    Zhang, Weizhe
    et al.
    School of Computer Science and Technology, Harbin Institute of Technology.
    Li, Xiong
    School of Computer Science and Technology, Harbin Institute of Technology.
    Xiong, Naixue
    Department of Business and Computer Science, Southwestern Oklahoma State University, Oklahoma, OK.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Android platform-based individual privacy information protection system2016In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917, Vol. 20, no 6, p. 875-884Article in journal (Refereed)
    Abstract [en]

    With the popularity of mobile phones based on the Android platform, the protection of personal privacy information on Android has received increasing attention. Considering the privacy problem that arises when a mobile phone is lost, this paper uses SMS for remote control of mobile phones, provides a comprehensive personal information protection method for users, and implements a mobile terminal system with self-protection characteristics. The system does not depend on server support and can protect personal information using only the most basic SMS function, which is one innovation of the system. Moreover, the redundant-process protection mechanism, the trusted number mechanism and the SIM card detection mechanism are further innovations of this system. Functional and performance tests show that the system satisfies user functional and non-functional requirements, with stable operation and high task execution efficiency.
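
    The control logic sketched below is a platform-neutral illustration of the trusted-number idea: only SMS commands from pre-registered numbers are honoured, and each command maps to a protective action. Command names, numbers and handlers are hypothetical; a real implementation would invoke Android telephony and device-administration APIs.

```python
# Trusted-number SMS command dispatch (logic illustration only; not Android code).
TRUSTED_NUMBERS = {"+46701234567"}   # hypothetical owner-controlled numbers

def lock_device():     return "device locked"
def wipe_private():    return "private data wiped"
def report_location(): return "location SMS sent to trusted number"

COMMANDS = {"#LOCK": lock_device, "#WIPE": wipe_private, "#LOCATE": report_location}

def on_sms_received(sender, body):
    """Ignore anything that is not a known command from a trusted number."""
    if sender not in TRUSTED_NUMBERS:
        return "ignored: untrusted sender"
    action = COMMANDS.get(body.strip().upper())
    return action() if action else "ignored: unknown command"

print(on_sms_received("+46701234567", "#lock"))   # device locked
print(on_sms_received("+46709999999", "#WIPE"))   # ignored: untrusted sender
```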

  • 246.
    Idowu, Samuel
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Saguna, Saguna
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Schelén, Olov
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Applied Machine Learning: Forecasting Heat Load in District Heating System2016In: Energy and Buildings, ISSN 0378-7788, E-ISSN 1872-6178, Vol. 133, p. 478-488Article in journal (Refereed)
    Abstract [en]

    Forecasting energy consumption in buildings is a key step towards the realization of optimized energy production, distribution and consumption. This paper presents a data-driven approach for the analysis and forecasting of the aggregate space and water thermal load in buildings. The analysis and forecast models are built using district heating data unobtrusively collected from ten residential and commercial buildings located in Skellefteå, Sweden. The load forecast models are generated using supervised machine learning techniques, namely support vector machine, regression tree, feed-forward neural network, and multiple linear regression. The models take the outdoor temperature, historical values of heat load, time factor variables and physical parameters of district heating substations as input. A performance comparison among the machine learning methods and an identification of the importance of the models' input variables are carried out. The models are evaluated with forecast horizons of 1 to 48 hours, in hourly steps. Our results show that support vector machine, feed-forward neural network and multiple linear regression are more suitable machine learning methods, with lower performance errors than the regression tree. The support vector machine has the lowest normalized root mean square error of 0.07 for a forecast horizon of 24 hours.
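
    A minimal, synthetic-data version of this forecasting setup is sketched below: predict heat load from outdoor temperature, the previous hour's load and the hour of day, and compare models by a normalised RMSE (here normalised by the target range). The generated data, features and hyper-parameters are assumptions for illustration, not the Skellefteå dataset or the paper's tuned models.

```python
# Heat-load forecasting sketch: SVR vs. multiple linear regression on synthetic data.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                                   # 60 days of hourly samples
temp = 5 + 10 * np.sin(2 * np.pi * hours / (24 * 365)) - 8 * np.sin(2 * np.pi * hours / 24)
load = 120 - 3 * temp + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 4, hours.size)

X = np.column_stack([temp[1:], load[:-1], hours[1:] % 24])   # temperature, lagged load, hour
y = load[1:]
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

def nrmse(model):
    """RMSE on the held-out tail, normalised by the observed target range."""
    pred = model.fit(X_tr, y_tr).predict(X_te)
    return np.sqrt(mean_squared_error(y_te, pred)) / (y_te.max() - y_te.min())

for name, model in [("SVR", SVR(C=100.0)), ("MLR", LinearRegression())]:
    print(f"{name}: NRMSE = {nrmse(model):.3f}")
```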

  • 247. Nanda, Rohan
    et al.
    Saguna, Saguna
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Mitra, Karan
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    BayesForSG: A Bayesian Model for Forecasting Thermal Load in Smart Grids2016In: SAC '16: Proceedings of the 31st Annual ACM Symposium on Applied Computing, New York: ACM Digital Library, 2016, p. 2135-2141Conference paper (Refereed)
    Abstract [en]

    Forecasting the thermal load demand for residential buildings assists in optimizing energy production and developing demand response strategies in a smart grid system. However, the presence of a large number of factors such as outdoor temperature, district heating operational parameters, building characteristics and occupant behaviour makes thermal load forecasting a challenging task. This paper presents an efficient model for thermal load forecasting in buildings with different variations of heat load consumption across both winter and spring seasons using a Bayesian network. The model has been validated using realistic district heating data from three residential buildings in the district heating grid of the city of Skellefteå, Sweden, over a period of four months. The results from our model show that the current heat load consumption and the outdoor temperature forecast have the most influence on the heat load forecast. Further, our model outperforms state-of-the-art methods for heat load forecasting by achieving a higher average accuracy of 77.97% while utilizing only 10% of the training data, for a forecast horizon of 1 hour.
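
    As a toy illustration of a Bayesian-network-style forecast with the two parents the abstract identifies as most influential (current heat load and forecast outdoor temperature), the sketch below discretises the variables, estimates the conditional distribution of the next heat-load bin from counts, and predicts the most probable bin. The synthetic data and binning are assumptions; the BayesForSG model itself is richer.

```python
# Count-based conditional table P(next load bin | temperature bin, current load bin).
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(0)
temp = rng.uniform(-20, 10, 2000)                            # synthetic outdoor temperature
load = 80 - 2.5 * temp + rng.normal(0, 6, temp.size)         # synthetic hourly heat load

def bin_of(x, edges):
    return int(np.digitize(x, edges))

t_edges, l_edges = [-10, 0], np.percentile(load, [33, 66])   # assumed discretisation
counts = defaultdict(Counter)
for i in range(len(load) - 1):
    parents = (bin_of(temp[i + 1], t_edges), bin_of(load[i], l_edges))
    counts[parents][bin_of(load[i + 1], l_edges)] += 1        # conditional-table counts

def forecast(next_temp, current_load):
    """Most probable next load bin given the forecast temperature and current load."""
    table = counts[(bin_of(next_temp, t_edges), bin_of(current_load, l_edges))]
    return table.most_common(1)[0][0] if table else None

print("predicted load bin:", forecast(next_temp=-15.0, current_load=115.0))
```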

  • 248.
    Tsai, Chun-Wei
    et al.
    Department of Computer Science and Information Engineering, National Ilan University, Yilan.
    Lai, Chin-Feng
    Institute of Computer Science and Information Engineering, National Chung Cheng University, Chia-Yi.
    Chao, Han-Chieh
    Department of Computer Science and Information Engineering, National Ilan University, Yilan.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Big Data Analytics2016In: Big Data Technologies and Applications, Springer International Publishing , 2016, p. 13-52Chapter in book (Refereed)
    Abstract [en]

    The age of big data is now coming, but traditional data analytics may not be able to handle such large quantities of data. The question that arises now is how to develop a high-performance platform to efficiently analyze big data and how to design appropriate mining algorithms to find useful information in big data. To discuss this issue in depth, this paper begins with a brief introduction to data analytics, followed by a discussion of big data analytics. Some important open issues and further research directions are also presented for the next steps of big data analytics.

  • 249.
    Yaqoob, Ibrar
    et al.
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Abaker Targio Hashem, Ibrahim
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Gani, Abdullah
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Mokhtar, Salimah
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Ahmed, Ejaz
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Badrul Anuar, Nor
    Centre for Mobile Cloud Computing Research (C4MCCR), Faculty of Computer Science and Information Technology, University of Malaya.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Big data: From beginning to future2016In: International Journal of Information Management, ISSN 0268-4012, E-ISSN 1873-4707, Vol. 36, no 6B, p. 1231-1247Article in journal (Refereed)
    Abstract [en]

    Big data is a promising research area receiving considerable attention from academia and IT communities. In the digital world, the amounts of data generated and stored have expanded within a short period of time. Consequently, this fast-growing rate of data has created many challenges. In this paper, we use the structuralism and functionalism paradigms to analyze the origins of big data applications and their current trends. This paper presents a comprehensive discussion of state-of-the-art big data technologies based on batch and stream data processing. Moreover, the strengths and weaknesses of these technologies are analyzed. This study also discusses big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies based on important parameters are also investigated. Emerging technologies are recommended as a solution for big data problems.

  • 250.
    Yu, Yong
    et al.
    Big Data Research Center, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu.
    Xue, Liang
    Department of Computing, The Hong Kong Polytechnic University.
    Au, Man Ho
    Department of Computing, The Hong Kong Polytechnic University.
    Susilo, Willy
    Center for Computer and Information Security Research, School of Computing and Information Technology, University of Wollongong.
    Ni, Jianbin
    Big Data Research Center, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu.
    Zhang, Yafang
    Big Data Research Center, School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Shen, Jian
    School of Computer and Software at Nanjing University of Information Science and Technology, Nanjing.
    Cloud data integrity checking with an identity-based auditing mechanism from RSA2016In: Future generations computer systems, ISSN 0167-739X, E-ISSN 1872-7115, Vol. 62, p. 85-91Article in journal (Refereed)
    Abstract [en]

    Cloud data auditing is essential for securing cloud storage since it enables cloud users to verify the integrity of their outsourced data efficiently. The computation overheads on both the cloud server and the verifier can be significantly reduced by data auditing because there is no need to retrieve the entire file; a spot-checking technique is used instead. A number of cloud data auditing schemes have been proposed recently, but a majority of the proposals are based on Public Key Infrastructure (PKI). There are some drawbacks in these protocols: (1) it is mandatory to verify the validity of public key certificates before using any public key, which makes the verifier incur expensive computation costs; (2) complex certificate management makes the whole protocol inefficient. To address the key management issues in cloud data auditing, in this paper we propose ID-CDIC, an identity-based cloud data integrity checking protocol which can eliminate the complex certificate management of traditional cloud data integrity checking protocols. The proposed concrete construction from RSA signatures can support variable-sized file blocks and public auditing. In addition, we provide a formal security model for ID-CDIC and prove the security of our construction under the RSA assumption with large public exponents in the random oracle model. We demonstrate the performance of our proposal by developing a prototype of the protocol. Implementation results show that the proposed ID-CDIC protocol is very practical and adoptable in real life.
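
    The spot-checking idea, as opposed to the identity-based RSA construction itself, can be shown in a few lines: the verifier challenges a random sample of block indices and checks the returned blocks against stored tags. Plain SHA-256 digests are used as stand-in tags below, which is explicitly not the ID-CDIC scheme; the sketch only illustrates why sampling avoids reading the whole file.

```python
# Spot-checking integrity audit with stand-in hash tags (not the ID-CDIC construction).
import hashlib, os, random

BLOCK = 4096

def make_tags(blocks):
    """Verifier-side tags; real ID-CDIC tags are identity-based RSA signatures."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def audit(cloud_blocks, tags, sample_size=10, seed=7):
    """Challenge a random sample of block indices and verify each returned block.
    Detection is probabilistic: corruption is caught only if a bad block is sampled,
    and a larger sample raises the detection probability."""
    rng = random.Random(seed)
    challenge = rng.sample(range(len(tags)), min(sample_size, len(tags)))
    return all(hashlib.sha256(cloud_blocks[i]).hexdigest() == tags[i] for i in challenge)

data = os.urandom(100 * BLOCK)                                   # the outsourced file
blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
tags = make_tags(blocks)                                         # kept by the verifier

print("intact file passes audit:", audit(blocks, tags))
tampered = list(blocks)
for i in range(0, len(tampered), 4):                             # cloud corrupts a quarter of the blocks
    tampered[i] = os.urandom(BLOCK)
detected = sum(not audit(tampered, tags, seed=s) for s in range(50))
print(f"corruption detected in {detected}/50 independent audits")
```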
