101 - 150 of 803
  • 101.
    Laine, Teemu H.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Mobile Educational Augmented Reality Games: A Systematic Literature Review and Two Case Studies. 2018. In: Computers, E-ISSN 2073-431X, Vol. 7, no 1, article id 19. Article in journal (Refereed)
    Abstract [en]

    Augmented reality (AR) has evolved from research projects into mainstream applications that cover diverse fields, such as entertainment, health, business, tourism and education. In particular, AR games, such as Pokémon Go, have contributed to introducing the AR technology to the general public. The proliferation of modern smartphones and tablets with large screens, cameras, and high processing power has ushered in mobile AR applications that can provide context-sensitive content to users whilst freeing them to explore the context. To avoid ambiguity, I define mobile AR as a type of AR where a mobile device (smartphone or tablet) is used to display and interact with virtual content that is overlaid on top of a real-time camera feed of the real world. Beyond being mere entertainment, AR and games have been shown to possess significant affordances for learning. Although previous research has done a decent job of reviewing research on educational AR applications, I identified a need for a comprehensive review on research related to educational mobile AR games (EMARGs). This paper explored the research landscape on EMARGs over the period 2012–2017 through a systematic literature review complemented by two case studies in which the author participated. After a comprehensive literature search and filtering, I analyzed 31 EMARGs from the perspectives of technology, pedagogy, and gaming. Moreover, I presented an analysis of 26 AR platforms that can be used to create mobile AR applications. I then discussed the results in depth and synthesized my interpretations into 13 guidelines for future EMARG developers.

  • 102.
    Cai, H.
    et al.
    School of Software, Shanghai JiaoTong University, Shanghai, China.
    Gu, Y.
    School of Software, Shanghai JiaoTong University, Shanghai, China.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Xu, B.
    College of Economics and Management, Shanghai JiaoTong University, Shanghai, China.
    Zhou, J.
    School of Software, Shanghai JiaoTong University, Shanghai, China.
    Model-Driven Development Patterns for Mobile Services in Cloud of Things. 2018. In: IEEE Transactions on Cloud Computing, ISSN 2168-7161, Vol. 6, no 3, p. 771-784, article id 7399727. Article in journal (Refereed)
    Abstract [en]

    Cloud of Things (CoT) is an integration of the Internet of Things (IoT) and cloud computing for intelligent and smart applications, especially in mobile environments. Model Driven Architecture (MDA) is used to develop Software as a Service (SaaS) so as to facilitate mobile application development by relieving developers from technical details. However, traditional service composition or mashup approaches are often inapplicable due to complex relations and heterogeneous deployment environments. For the purpose of building cloud-enabled mobile applications in a configurable and adaptive way, model-driven development patterns based on a semantic reasoning mechanism are provided for CoT application development. Firstly, a meta-model covering both multi-view business elements and service components is provided for model transformation. Then, based on a formal representation of models, three patterns from different tiers of the Model-View-Controller (MVC) framework are used to transform business models into a service component system so as to configure cloud services rapidly. Lastly, a related software platform is provided for verification. The results show that the platform is applicable for rapid system development by means of various service integration patterns.

  • 103.
    Rahimi, M. Reza
    et al.
    Huawei Innovation Center, US R&D Storage Lab, Santa Clara.
    Venkatasubramanian, Nalini
    School of Information and Computer Science, University of California, Irvine.
    Mehrotra, Sharad
    School of Information and Computer Science, University of California, Irvine.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    On Optimal and Fair Service Allocation in Mobile Cloud Computing. 2018. In: IEEE Transactions on Cloud Computing, ISSN 2168-7161, Vol. 6, no 3, p. 815-828. Article in journal (Refereed)
    Abstract [en]

    This paper studies optimal and fair service allocation for a variety of mobile applications (single, group, and collaborative mobile applications) in mobile cloud computing. We exploit the observation that using tiered clouds, i.e., clouds at multiple levels (local and public), can increase the performance and scalability of mobile applications. We propose a novel framework to model mobile applications as location-time workflows (LTWs) of tasks, where user mobility patterns are translated to mobile service usage patterns. We show that an optimal mapping of LTWs to tiered cloud resources considering multiple QoS goals, such as application delay, device power consumption and user cost/price, is an NP-hard problem for both single and group-based applications. We propose an efficient heuristic algorithm called MuSIC that performs well (73% of optimal, 30% better than simple strategies) and scales well to a large number of users while ensuring high mobile application QoS. We evaluate MuSIC and the 2-tier mobile cloud approach via implementation (on real-world clouds) and extensive simulations using rich mobile applications such as intensive signal processing, video streaming and multimedia file sharing. We observe about 25% lower delays and power consumption (under fixed price constraints) and about a 35% decrease in price (considering fixed delay) in comparison to using only the public cloud. Our studies also show that MuSIC performs well under different mobility patterns, e.g., random waypoint and Manhattan models.

  • 104.
    Abedin, Md. Zainal
    et al.
    University of Science and Technology, Chittagong.
    Siddiquee, Kazy Noor E Alam
    University of Science and Technology Chittagong.
    Bhuyan, M. S.
    University of Science & Technology Chittagong.
    Karim, Razuan
    University of Science and Technology Chittagong.
    Hossain, Mohammad Shahadat
    University of Chittagong, Bangladesh.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Performance Analysis of Anomaly Based Network Intrusion Detection Systems. 2018. In: Proceedings of the 43rd IEEE Conference on Local Computer Networks Workshops (LCN Workshops), Piscataway, NJ: IEEE Computer Society, 2018, p. 1-7. Conference paper (Refereed)
    Abstract [en]

    Because of the increased popularity and fast expansion of the Internet as well as the Internet of Things, networks are growing rapidly in every corner of society. As a result, a huge amount of data is travelling across computer networks, which leads to vulnerabilities in data integrity, confidentiality and reliability. Network security is therefore a burning issue for keeping the integrity of systems and data. Traditional security guards such as firewalls with access control lists are no longer enough to secure systems. To address the drawbacks of traditional Intrusion Detection Systems (IDSs), artificial intelligence and machine learning based models open up new opportunities to classify abnormal traffic as anomalies with a self-learning capability. Many supervised learning models have been adopted to detect anomalies in network traffic. In a quest to select a good learning model in terms of precision, recall, area under the receiver operating curve, accuracy, F-score and model build time, this paper illustrates a performance comparison between Naïve Bayes, Multilayer Perceptron, J48, Naïve Bayes Tree, and Random Forest classification models. These models are trained and tested on three subsets of features derived from the original benchmark network intrusion detection dataset, NSL-KDD. The three subsets are derived by applying different attribute evaluator algorithms. The simulation is carried out using the WEKA data mining tool.
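The abstract scores its classifiers on precision, recall, accuracy and F-score; these headline metrics all derive from the binary confusion matrix, as in this minimal, generic sketch (the counts are hypothetical, and this is not the paper's WEKA setup):

```python
def binary_metrics(tp, fp, fn, tn):
    """Compute the headline metrics named in the abstract from a
    binary confusion matrix (anomaly = positive class)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # a.k.a. detection rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_score = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "accuracy": accuracy, "f_score": f_score}

# Hypothetical counts for one classifier on a test split
m = binary_metrics(tp=900, fp=100, fn=50, tn=950)
print(m)
```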

  • 105.
    Cruciani, Federico
    et al.
    Ulster University.
    Cleland, Ian
    Ulster University.
    Nugent, Chris
    Ulster University.
    McCullagh, Paul
    Ulster University.
    Synnes, Kåre
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Personalized Online Training for Physical Activity monitoring using weak labels2018In: 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), IEEE, 2018, p. 567-572Conference paper (Refereed)
    Abstract [en]

    The use of smartphones for activity recognition is becoming common practice. Most approaches use a single pretrained classifier to recognize activities for all users. Research studies, however, have highlighted how a personalized trained classifier could provide better accuracy. Data labeling for ground truth generation, however, is a time-consuming process. The challenge is further exacerbated when opting for a personalized approach that requires user-specific datasets to be labeled, making conventional supervised approaches unfeasible. In this work, we present early results of an investigation into a weakly supervised approach for online personalized activity recognition. This paper describes: (i) a heuristic to generate weak labels used for personalized training, and (ii) a comparison of accuracy obtained using a weakly supervised classifier against a conventional ground-truth-trained classifier. Preliminary results show an overall accuracy of 87% for the fully supervised approach against 74% for the proposed weakly supervised approach.

  • 106.
    Dadhich, Siddharth
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Sandin, Fredrik
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Embedded Internet Systems Lab.
    Bodin, Ulf
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Predicting bucket-filling control actions of a wheel-loader operator using a neural network ensemble. 2018. In: 2018 International Joint Conference on Neural Networks (IJCNN), Piscataway, NJ: IEEE, 2018, article id 8489388. Conference paper (Refereed)
    Abstract [en]

    Automatic bucket filling has been an open problem for three decades. In this paper, we address this problem with supervised machine learning using data collected from manual operation. The range-normalized actuations of the lift joystick, tilt joystick and throttle pedal are predicted using information from sensors on the machine, and the prediction errors are quantified. We apply linear regression, k-nearest neighbors, neural networks, regression trees and ensemble methods and find that an ensemble of neural networks results in the most accurate predictions. The prediction root-mean-square error (RMSE) of the lift action exceeds that of the tilt and throttle actions, and we obtain an RMSE below 0.2 for complete bucket fillings after training with as little as 135 bucket-filling examples.
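The ensemble approach the abstract describes can be sketched in miniature: average the outputs of several regressors and score the result with RMSE. The member models and data below are toy stand-ins, not the paper's trained networks:

```python
import math

def ensemble_predict(members, x):
    """Average the outputs of the ensemble members (each a callable
    mapping a sensor feature to a joystick actuation in [0, 1])."""
    return sum(m(x) for m in members) / len(members)

def rmse(y_true, y_pred):
    """Root-mean-square error, the metric reported in the abstract."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

# Toy stand-ins for trained networks (NOT the paper's models):
# individually biased, but their errors cancel in the average.
members = [lambda x: 0.5 * x,
           lambda x: 0.5 * x + 0.1,
           lambda x: 0.5 * x - 0.1]
xs = [0.0, 0.4, 0.8]
y_true = [0.0, 0.2, 0.4]
y_ens = [ensemble_predict(members, x) for x in xs]
print(rmse(y_true, y_ens))
```

Here the ensemble's RMSE is lower than that of any biased member, which is the usual motivation for averaging independently trained networks.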

  • 107.
    Bezerra, Nibia Souza
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Wang, Min
    Network Architecture and Protocols Research, Ericsson.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Nordberg, Mats
    Network Architecture and Protocols Research, Ericsson.
    Schelén, Olov
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    RACH performance in massive machine-type communications access scenario. 2018. Conference paper (Refereed)
    Abstract [en]

    With the increasing number of devices performing Machine-Type Communications (MTC), mobile networks are expected to encounter a high load of burst transmissions. One bottleneck in such cases is the Random Access Channel (RACH) procedure, which is responsible for the attachment of devices, among other things. In this paper, we performed a rich-parameter based simulation on RACH to identify the procedure bottlenecks. A finding from the studied scenarios is that the Physical Downlink Control Channel (PDCCH) capacity for the grant allocation is the main limitation for the RACH capacity rather than the number of Physical Random Access Channel (PRACH) preambles. Guided by our simulation results, we proposed improvements to the RACH procedure and to PDCCH.

  • 108.
    Souza Bezerra, Níbia
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Wang, Min
    Luleå University of Technology, External.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering.
    Nordberg, Mats
    Luleå University of Technology, External.
    Schelén, Olov
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    RACH performance in massive machine-type communications access scenario. 2018. In: / [ed] IEEE, 2018. Conference paper (Refereed)
    Abstract [en]

    With the increasing number of devices performing Machine-Type Communications (MTC), mobile networks are expected to encounter a high load of burst transmissions. One bottleneck in such cases is the Random Access Channel (RACH) procedure, which is responsible for the attachment of devices, among other things. In this paper, we performed a rich-parameter based simulation on RACH to identify the procedure bottlenecks. A finding from the studied scenarios is that the Physical Downlink Control Channel (PDCCH) capacity for the grant allocation is the main limitation for the RACH capacity rather than the number of Physical Random Access Channel (PRACH) preambles. Guided by our simulation results, we proposed improvements to the RACH procedure and to PDCCH.

  • 109.
    Zhohov, Roman
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Minovski, Dimitar
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. InfoVista Sweden.
    Johansson, Per
    InfoVista Sweden.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Real-time Performance Evaluation of LTE for IIoT. 2018. In: Proceedings of the 43rd IEEE Conference on Local Computer Networks (LCN) / [ed] Soumaya Cherkaoui, Institute of Electrical and Electronics Engineers (IEEE), 2018. Conference paper (Refereed)
    Abstract [en]

    The Industrial Internet of Things (IIoT) is claimed to be a global booster technology for economic development. IIoT brings bulky use-cases with a simple goal of enabling automation, autonomation or just plain digitalization of industrial processes. The abundance of interconnected IoT and CPS devices generates an additional burden on telecommunication networks, imposing a number of challenges for satisfying the key performance requirements, in particular the QoS metrics related to real-time data exchange for critical machine-to-machine communication. This paper analyzes a real-world example of IIoT from a QoS perspective: a remotely operated underground mining vehicle. As part of the performance evaluation, a software tool is developed for estimating the absolute one-way delay in end-to-end transmissions. The measured metric is passed to a machine learning model for one-way delay prediction based on LTE RAN measurements using a commercially available cutting-edge software tool. The achieved results demonstrate the possibility of predicting the delay figures using a machine learning model with a coefficient of determination up to 90%.
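The "coefficient of determination up to 90%" reported above is the standard R² score, computed as below (the delay values here are made up for illustration, not taken from the paper's measurements):

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean) ** 2 for y in y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    return 1 - ss_res / ss_tot

# Hypothetical one-way delays (ms): measured vs. model-predicted
measured = [12.0, 15.0, 20.0, 18.0, 25.0]
predicted = [12.5, 14.0, 19.5, 18.5, 24.0]
print(r2_score(measured, predicted))
```

An R² of 0.9 means the model explains 90% of the variance in the measured one-way delay.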

  • 110.
    Du, Wei
    et al.
    Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai , Institute of Textiles and Clothing, The Hong Kong Polytechnic University.
    Tang, Yang
    Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai.
    Leung, Sunney Yung Sun
    Institute of Textile and Clothing, The Hong Kong Polytechnic University.
    Tong, Le
    Institute of Textile and Clothing, The Hong Kong Polytechnic University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Qian, Feng
    Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai , Institute of Textiles and Clothing, The Hong Kong Polytechnic University.
    Robust Order Scheduling in the Discrete Manufacturing Industry: a Multiobjective Optimization Approach. 2018. In: IEEE Transactions on Industrial Informatics, ISSN 1551-3203, E-ISSN 1941-0050, Vol. 14, no 1, p. 253-264, article id 7842622. Article in journal (Refereed)
    Abstract [en]

    Order scheduling is of vital importance in discrete manufacturing industries. This paper takes the fashion industry as an example and discusses its robust order scheduling problem. In the fashion industry, order scheduling focuses on the assignment of production orders to appropriate production lines. In reality, before a new order can be put into production, a series of activities known as preproduction events needs to be completed. In addition, in the real production process, owing to various uncertainties, the daily production quantity of each order is not always as expected. In this paper, by considering the preproduction events and the uncertainties in the daily production quantity, robust order scheduling problems in the fashion industry are investigated with the aid of a multiobjective evolutionary algorithm called nondominated sorting adaptive differential evolution (NSJADE). The experimental results illustrate that it is of paramount importance to consider preproduction events in order scheduling problems in the fashion industry. We also unveil that the uncertainties in the daily production quantity heavily affect order scheduling.

  • 111.
    Chatterjee, Santanu
    et al.
    Research Center Imarat, Defence Research and Development Organization, Hyderabad.
    Roy, Sandip
    Department of Computer Science and Engineering, Asansol Engineering College, Asansol.
    Kumar Das, Ashok
    Center for Security, Theory and Algorithmic Research, International Institute of Information Technology, Hyderabad.
    Chattopadhyay, Samiran
    Department of Information Technology, Jadavpur University, Salt Lake City.
    Kumar, Neeraj
    Department of Computer Science and Engineering, Thapar University, Patiala.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Secure Biometric-Based Authentication Scheme using Chebyshev Chaotic Map for Multi-Server Environment. 2018. In: IEEE Transactions on Dependable and Secure Computing, ISSN 1545-5971, E-ISSN 1941-0018, Vol. 15, no 5, p. 824-839. Article in journal (Refereed)
    Abstract [en]

    A multi-server environment is the most common scenario for a large number of enterprise-class applications. In this environment, user registration at each server is not recommended. Using a multi-server authentication architecture, a user can manage authentication to various servers using a single identity and password. We introduce a new authentication scheme for multi-server environments using the Chebyshev chaotic map. In our scheme, we use the Chebyshev chaotic map and biometric verification along with password verification for authorization and access to various application servers. The proposed scheme is lightweight compared to other related schemes. We only use the Chebyshev chaotic map, a cryptographic hash function and symmetric key encryption/decryption in the proposed scheme. Our scheme provides strong authentication, and also supports a biometrics and password change phase by a legitimate user at any time locally, as well as a dynamic server addition phase. We perform formal security verification using the broadly accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that the presented scheme is secure. In addition, we carry out a formal security analysis using the Burrows-Abadi-Needham (BAN) logic along with random oracle models and prove that our scheme is secure against different known attacks. High security and significantly low computation and communication costs make our scheme very suitable for multi-server environments as compared to other existing related schemes.
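The Chebyshev chaotic map underlying such schemes is T_n(x) = cos(n·arccos x) on [-1, 1], and its semigroup property T_r(T_s(x)) = T_rs(x) is what enables Diffie-Hellman-style key agreement. A minimal numeric check of that property (practical deployments use an extended Chebyshev map over a large prime field, which this sketch omits):

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) = cos(n * arccos(x)), x in [-1, 1]."""
    return math.cos(n * math.acos(x))

# Semigroup property: T_r(T_s(x)) == T_s(T_r(x)) == T_{r*s}(x).
# Two parties with secret exponents r and s derive the same value.
x, r, s = 0.3, 5, 7              # x public; r, s private (toy sizes)
shared1 = chebyshev(r, chebyshev(s, x))
shared2 = chebyshev(s, chebyshev(r, x))
print(abs(shared1 - shared2) < 1e-9)  # True: both equal T_35(x)
```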

  • 112.
    Andersson, Karl
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    You, Ilsun
    Soonchunhyang University, Chungcheongnam-do, Republic of Korea.
    Palmieri, Francesco
    University of Salerno, Fisciano (SA), Italy.
    Security and Privacy for Smart, Connected, and Mobile IoT Devices and Platforms. 2018. In: Security and Communication Networks, ISSN 1939-0114, E-ISSN 1939-0122, Vol. 2018, p. 1-2, article id 5346596. Article in journal (Refereed)
  • 113.
    Schmidt, Mischa
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. NEC Laboratories Europe.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Smart Buildings as Cyber-Physical Systems: Data-Driven Predictive Control Strategies for Energy Efficiency. 2018. In: Renewable & Sustainable Energy Reviews, ISSN 1364-0321, E-ISSN 1879-0690, Vol. 90, p. 742-756. Article in journal (Refereed)
    Abstract [en]

    Due to its significant contribution to global energy usage and the associated greenhouse gas emissions, the energy efficiency of the existing building stock must improve. Predictive building control promises to contribute to that by increasing the efficiency of building operations. Predictive control complements other means to increase performance such as refurbishments as well as modernizations of systems. This survey reviews recent works and contextualizes these with the current state of the art of interrelated topics in data handling, building automation, distributed control, and semantics. The comprehensive overview leads to seven research questions guiding future research directions.

  • 114.
    Bai, Emilien
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Synnes, Kåre
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Supporting Collaborative Care of Elderly through a Reward System based on Distributed Ledger Technologies2018In: International Journal On Advances in Life Sciences, ISSN 1942-2660, E-ISSN 1942-2660, Vol. 10, no 1-2, p. 90-102Article in journal (Refereed)
    Abstract [en]

    This paper discusses supporting collaborative care of elderly through a reward system based on distributed ledger technologies. The design and implementation of such a reward system, which connects elderly and volunteers by mutual agreements, involve technologies such as smart contracts and blockchains. The work is motivated by the demographic change, where an aging population consequently increases the need for care. This causes a great tension in our society, as care resources become increasingly constrained, both regarding costs and availability of care staff. Much of the daily care of the elderly is today done by family members (spouses, children) and friends, often on a voluntary basis, which adds to the tension. The core idea of this work is to help broaden the involvement of people in caring for our elderly, enabled by a system for collaborative care. The proposed system benefits from recent advances in distributed ledger technologies, which, similarly to digital currencies, are built on the ability to form mutual agreements between people who do not know each other. The system also benefits from recent gamification techniques to motivate people to collaborate on a larger scale through performing simple daily tasks. The proposed system benefits from inherent advantages of distributed ledger technologies, such as a high level of decentralization, and thus high availability, and strong data consistency. These advantages make it interesting to develop possible links between blockchains and the outside world to allow for a higher level of automation and distribution of services such as collaborative care. Newer models for distributed ledger technologies, such as Iota tangles or the Swirld platform, may however scale and perform better than blockchains. These should thus be considered for a full implementation and test of the system.
In summary, this paper presents a novel framework and prototype implementation of a reward system supporting collaborative care of elderly, that is based on distributed ledger technologies.

  • 115.
    Lindqvist, Anna-Karin
    et al.
    Luleå University of Technology, Department of Health Sciences, Health and Rehabilitation.
    Castelli, Darla
    Kinetic Kidz Lab, Department of Kinesiology and Health Education, University of Texas at Austin, Austin, TX, United States. .
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Rutberg, Stina
    Luleå University of Technology, Department of Health Sciences, Health and Rehabilitation.
    The Praise and Price of Pokémon GO: A Qualitative Study of Children's and Parents' Experiences. 2018. In: JMIR Serious Games, E-ISSN 2291-9279, Vol. 6, no 1, article id e1. Article in journal (Refereed)
    Abstract [en]

    Background: Physical activity has multiple health benefits; however, the majority of children around the world do not attain the recommended levels of daily physical activity. Research has shown that the game Pokémon GO has increased the amount of physical activity of players and that the game has the potential to reach populations that traditionally have low levels of physical activity. Therefore, there is a need to understand which game components can promote initial and sustained physical activity. By using a qualitative research approach, it is possible to achieve rich descriptions and enhance a deep understanding of the components promoting physical activity among children in a game such as Pokémon GO.

    Objective: The objective of this study was to explore children’s and parents’ experiences playing Pokémon GO.

    Methods: Eight families comprising 13 children (aged 7-12 years) and 9 parents were selected using purposeful sampling. Data collected using focus groups were analyzed using qualitative latent content analysis.

    Results: The following three themes were revealed: (1) exciting and enjoyable exploration; (2) dangers and disadvantages; and (3) cooperation conquers competition. The first centers around the present and possible future aspects of Pokémon GO that promote physical activity. The second focuses on unwanted aspects and specific threats to safety when playing the game. The third shows that cooperation and togetherness are highly valued by the participants and that competition is fun but less important.

    Conclusions: Components from Pokémon GO could enhance the efficacy of physical activity interventions. Cooperation and exploration are aspects of the game that preferably could be transferred into interventions aimed at promoting children’s physical activity.

  • 116.
    Dirin, Amir
    et al.
    Business Information Technology, Haaga-Helia University of Applied Science, Finland .
    Laine, Teemu H.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Towards an Adaptive Study Management Platform: Freedom Through Personalization. 2018. In: CSEDU 2018: Proceedings of the 10th International Conference on Computer Supported Education, SciTePress, 2018, Vol. 1, p. 432-439. Conference paper (Refereed)
    Abstract [en]

    Technological advancements have brought abundant freedom to our lives. In an educational context, however, technology utilization is still relatively low despite recent developments in various learning platforms such as e-learning, mobile learning, MOOCs, and social networks. Contemporary technological advancement in smart gadgets enables us to bring learning resources with appropriate content formats to learners at the right time in the right learning situation. Yet there remains a need for an adaptive study management solution that applies data mining algorithms to assist university students both before and during their studies in a personalized manner. This assistance can be of many kinds, such as campus orientation for new students, course curriculum recommendations, and customization of study paths. In this paper, we present the concept and an initial implementation of the Adaptive Study Management (ASM) platform, which aims at facilitating a university student's academic life in different phases by tracing the student's activities and providing personalized services, such as course curriculum recommendations, based on their behavior and achievements during a period. The ASM platform creates a profile for the student based on their achievements and competencies. Consequently, the platform aims to grant students freedom in their study management, ease teachers' workload in assessing students' performance, and assist teachers and administrators in following up on students and dropouts. The goal of this platform is to increase graduation rates by personalizing study management and providing analysis services, such as dropout prediction.

  • 117.
    Meng, Weizhi
    et al.
    Department of Applied Mathematics and Computer Science, Technical University of Denmark, Denmark.
    Raymond Choo, Kim-Kwang
    Department of Information Systems and Cyber Security and the Department of Electrical and Computer Engineering, The University of Texas at San Antonio, San Antonio, United States.
    Furnell, Steven
    School of Computing, Electronics and Mathematics, Plymouth University, United Kingdom.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Probst, Christian W.
    Unitec Institute of Technology, New Zealand.
    Towards Bayesian-based Trust Management for Insider Attacks in Healthcare Software-Defined Networks (2018). In: IEEE Transactions on Network and Service Management, ISSN 1932-4537, E-ISSN 1932-4537, Vol. 15, no 2, p. 761-773. Article in journal (Refereed)
    Abstract [en]

    The medical industry is increasingly digitalized and Internet-connected (e.g., the Internet of Medical Things), and when deployed in an Internet of Medical Things environment, software-defined networks (SDNs) allow the decoupling of network control from the data plane. There is no debate among security experts that the security of Internet-enabled medical devices is crucial, and insider attacks are an ongoing threat vector. In this paper, we focus on the identification of insider attacks in healthcare SDNs. Specifically, we survey stakeholders from 12 healthcare organizations (i.e., two hospitals and two clinics in Hong Kong, two hospitals and two clinics in Singapore, and two hospitals and two clinics in China). Based on the survey findings, we develop a trust-based approach based on Bayesian inference to identify malicious devices in a healthcare environment. Experimental results in both a simulated and a real-world network environment demonstrate the feasibility and effectiveness of our proposed approach regarding the detection of malicious healthcare devices; i.e., our approach decreases the trust values of malicious devices faster than similar approaches.
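The Bayesian trust mechanism summarized above can be sketched generically. The following is a minimal illustration of Bayesian trust updating with a Beta-Bernoulli model, a common choice in trust management; it is a hedged sketch of the general technique, not the paper's actual model:

```python
# Generic Beta-Bernoulli trust score: an illustration of Bayesian trust
# updating for networked devices, NOT the model from the cited paper.
class BetaTrust:
    def __init__(self, alpha=1.0, beta=1.0):
        # Uniform Beta(1, 1) prior over the device's benign-behaviour rate.
        self.alpha = alpha
        self.beta = beta

    def observe(self, benign: bool):
        # Conjugate update: benign interactions raise alpha,
        # malicious ones raise beta.
        if benign:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def trust(self) -> float:
        # Trust value = posterior mean of Beta(alpha, beta).
        return self.alpha / (self.alpha + self.beta)

honest, malicious = BetaTrust(), BetaTrust()
for _ in range(20):
    honest.observe(True)
    malicious.observe(False)
print(round(honest.trust, 3), round(malicious.trust, 3))
```

Each observation shifts the posterior, so a consistently misbehaving device's trust value drops quickly, which is the kind of behaviour the paper's evaluation measures.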

  • 118.
    Karvonen, Niklas
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Unobtrusive Activity Recognition in Resource-Constrained Environments (2018). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis discusses activity recognition from a perspective of unobtrusiveness, where devices are worn or placed in the environment without being stigmatising or in the way. The research focuses on performing unobtrusive activity recognition when computational and sensing resources are scarce. This includes investigating unobtrusive ways to gather data, as well as adapting data modelling and classification to small, resource-constrained, devices.

    The work presents different aspects of data collection and data modelling when only using unobtrusive sensing. This is achieved by considering how different sensor placements affect prediction performance and how activity models can be created when using a single sensor, or when using a number of simple binary sensors, to perform movement analysis, recognise everyday activities, and perform stress detection. The work also investigates how classification can be performed on resource-constrained devices, resulting in a novel computation-efficient classifier and an efficient hand-made classification model. The work finally sets unobtrusive activity recognition into real-life contexts where it can be used for interventions to reduce stress, sedentary behaviour and symptoms of dementia.

    The results indicate that activities can be recognised unobtrusively and that classification can be performed even on resource-constrained devices. This allows for monitoring a user’s activities over extensive periods, which could be used for creating highly personal digital interventions and in-time advice that help users make positive behaviour changes. Such digital health interventions based on unobtrusive activity recognition for resource-constrained environments are important for addressing societal challenges of today, such as sedentary behaviour, stress, obesity, and chronic diseases. The final conclusion is that unobtrusive activity recognition is a cornerstone necessary for bringing many digital health interventions into a wider use.

  • 119.
    Dirin, Amir
    et al.
    Business Information Technology, Haaga-Helia University of Applied Sciences, Hels.
    Laine, Teemu H.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    User Experience in Mobile Augmented Reality: Emotions, Challenges, Opportunities and Best Practices (2018). In: Computers, E-ISSN 2073-431X, Vol. 7, no 2, article id 33. Article in journal (Refereed)
    Abstract [en]

    Mobile Augmented Reality (MAR) is gaining strong momentum to become a major interactive technology that can be applied across domains and purposes. The rapid proliferation of MAR applications in global mobile application markets has been fueled by a range of freely available MAR software development kits and content development tools, some of which enable the creation of MAR applications even without programming skills. Despite the recent advances of MAR technology and tools, there are still many challenges associated with MAR from the User Experience (UX) design perspective. In this study, we first define UX as the emotions that the user encounters while using a service, a product or an application and then explore the recent research on the topic. We present two case studies, a commercial MAR experience and our own Virtual Campus Tour MAR application, and evaluate them from the UX perspective, with a focus on emotions. Next, we synthesize the findings from previous research and the results of the case study evaluations to form sets of challenges, opportunities and best practices related to UX design of MAR applications. Based on the identified best practices, we finally present an updated version of the Virtual Campus Tour. The results can be used for improving UX design of future MAR applications, thus making them emotionally engaging.

  • 120.
    Lin, Di
    et al.
    School of Information and Software Engineering, University of Electronic Science and Technology of Chengdu, 610051, China.
    Tang, Yu
    School of Information and Software Engineering, University of Electronic Science and Technology of Chengdu, 610051, China.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    User-Priority-Based Power Control in D2D Networks for Mobile Health (2018). In: IEEE Systems Journal, ISSN 1932-8184, E-ISSN 1937-9234, Vol. 12, no 4, p. 3142-3150, article id 4267003. Article in journal (Refereed)
    Abstract [en]

    Device-to-device (D2D) assisted cellular networks are a promising way to support ubiquitous healthcare applications, since they are expected to bring significant benefits such as improved user throughput and extended battery life for mobiles. However, D2D and cellular communications in the same network may cause cross-tier interference (CTI) to each other. Another critical issue of using D2D assisted cellular networks in a healthcare scenario is the electromagnetic interference (EMI) caused by RF transmissions, as a high level of EMI may lead to a critical malfunction of medical equipment. In consideration of CTI and EMI, we study the problem of optimizing energy efficiency (EE) across mobile users with different priorities (different levels of emergency) within the Internet of Vehicles for mobile health, and propose a penalty-function power control algorithm to solve this problem. Numerical results demonstrate that the proposed algorithm achieves remarkable improvements in EE while ensuring an allowable level of EMI on medical equipment.
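The penalty-function idea mentioned in the abstract can be illustrated generically: a constraint (here, an EMI ceiling) is folded into the objective as a penalty term, and the penalized objective is then maximized. All functions and numbers below are invented for illustration and are not the paper's formulation:

```python
import math

EMI_LIMIT = 0.1   # toy EMI ceiling (arbitrary units)
PENALTY = 10.0    # penalty coefficient

def energy_efficiency(p: float) -> float:
    # Toy EE objective: throughput log(1 + SNR) per unit of total power,
    # with channel gain 10 and circuit power 0.1 (all made-up values).
    return math.log(1 + 10 * p) / (p + 0.1)

def emi(p: float) -> float:
    # Toy EMI model: proportional to transmit power.
    return p

def penalized(p: float) -> float:
    # The EMI constraint folded into the objective as a penalty term.
    return energy_efficiency(p) - PENALTY * max(0.0, emi(p) - EMI_LIMIT)

# Grid search over candidate transmit powers; the penalty pushes the
# optimum down to a power that still respects the EMI limit.
best_p = max((i / 100 for i in range(1, 101)), key=penalized)
print(best_p)
```

In this toy setup the unconstrained EE optimum would violate the EMI ceiling, so the penalty term moves the chosen power back to the limit, which is the qualitative trade-off the paper studies.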

  • 121.
    Kikhia, Basel
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Stavropoulos, Thanos G.
    Information Technologies Institute, Centre for Research & Technology Hellas.
    Meditskos, Georgios
    Information Technologies Institute, Centre for Research & Technology Hellas.
    Kompatsiaris, Ioannis
    Information Technologies Institute, Centre for Research & Technology Hellas.
    Hallberg, Josef
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Sävenstedt, Stefan
    Luleå University of Technology, Department of Health Sciences, Nursing Care.
    Melander, Catharina
    Luleå University of Technology, Department of Health Sciences, Nursing Care.
    Utilizing ambient and wearable sensors to monitor sleep and stress for people with BPSD in nursing homes (2018). In: Journal of Ambient Intelligence and Humanized Computing, ISSN 1868-5137, E-ISSN 1868-5145, Vol. 9, no 2, p. 261-273. Article in journal (Refereed)
    Abstract [en]

    Clinical assessment of behavioral and psychological symptoms of dementia (BPSD) in nursing homes is often based on staff members’ observations and the use of the Neuropsychiatric Inventory-Nursing Home version (NPI-NH) instrument. This requires continuous observation of the person with BPSD, and a lot of effort and manual input from the nursing home staff. This article presents the DemaWare@NH monitoring framework, which complements traditional methods in measuring patterns of behavior, namely sleep and stress, for people with BPSD in nursing homes. The framework relies on ambient and wearable sensors for observing the users and analytics to assess their conditions. In our proof-of-concept scenario, four residents from two nursing homes were equipped with sleep and skin sensors, whose data is retrieved, processed and analyzed by the framework, detecting and highlighting behavioral problems, and providing relevant, accurate information to clinicians on sleep and stress patterns. The results indicate that structured information from sensors can ease and improve the understanding of behavioral patterns, and, as a consequence, the efficiency of care interventions, yielding a positive impact on the quality of the clinical assessment process for people with BPSD in nursing homes.

  • 122.
    Adalat, Mohsin
    et al.
    COSMOSE Research Group, Department of Computer Science, COMSATS University Islamabad, Islamabad, Pakistan.
    Niazi, Muaz A.
    COSMOSE Research Group, Department of Computer Science, COMSATS University Islamabad, Islamabad, Pakistan.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Variations in power of opinion leaders in online communication networks (2018). In: Royal Society Open Science, E-ISSN 2054-5703, Vol. 5, no 10, article id 180642. Article in journal (Refereed)
    Abstract [en]

    Online social media has completely transformed how we communicate with each other. While online discussion platforms are available in the form of applications and websites, an emergent outcome of this transformation is the phenomenon of ‘opinion leaders’. A number of previous studies have been presented to identify opinion leaders in online discussion networks. In particular, Feng (2016 Comput. Hum. Behav. 54, 43–53. (doi:10.1016/j.chb.2015.07.052)) identified five different types of central users besides outlining their communication patterns in an online communication network. However, that work focuses on a limited time span. The question remains as to whether similar communication patterns exist that will stand the test of time over longer periods. Here, we present a critical analysis of the Feng framework both for short-term as well as for longer periods. Additionally, for validation, we take another case study presented by Udanor et al. (2016 Program 50, 481–507. (doi:10.1108/PROG-02-2016-0011)) to further understand these dynamics. Results indicate that not all Feng-based central users may be identifiable in the longer term. Conversation starters and influencers were noted as opinion leaders in the network. These users play an important role as information sources in long-term discussions, whereas network builders and active engagers help in connecting otherwise sparse communities. Furthermore, we discuss the changing positions of opinion leaders and their power to keep isolates interested in an online discussion network.

  • 123.
    Zhang, Changsen
    et al.
    College of Computer Science and Technology, Henan Polytechnic University, Jiaozuo, Henan.
    Chen, Pengpeng
    College of Computer Science and Technology, Henan Polytechnic University, Jiaozuo, Henan.
    Ren, Jianji
    College of Computer Science and Technology, Henan Polytechnic University, Jiaozuo, Henan.
    Wang, Xiaofeng
    Department of Electrical and Computer Engineering, The University of British Columbia, Vancouver.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A backoff algorithm based on self-adaptive contention window update factor for IEEE 802.11 DCF (2017). In: Wireless networks, ISSN 1022-0038, E-ISSN 1572-8196, Vol. 23, no 3, p. 749-758. Article in journal (Refereed)
    Abstract [en]

    The binary exponential backoff (BEB) mechanism is applied to packet retransmission in many wireless network protocols, including IEEE 802.11 and 802.15.4. In distributed dynamic network environments, the fixed contention window (CW) update factor of the BEB mechanism cannot adapt properly to varying network sizes, resulting in serious collisions. To solve this problem, this paper proposes a backoff algorithm based on a self-adaptive contention window update factor for IEEE 802.11 DCF. In WLANs, the proposed backoff algorithm can greatly enhance throughput by setting the optimal CW update factor according to the theoretical analysis. When the number of active nodes varies, an intelligent scheme adaptively adjusts the CW update factor to achieve the maximal throughput at run time. As a result, it effectively reduces the number of collisions, improves channel utilization and retains the advantages of the binary exponential backoff algorithm, such as simplicity and zero cost. For the IEEE 802.11 distributed coordination function (DCF) protocol, numerical analysis of physical-layer parameters shows that the new backoff algorithm performs much better than the BEB, MIMD and MMS algorithms.
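The contrast between a fixed and a self-adaptive CW update factor can be sketched as follows. The thresholds and factors below are invented for illustration; the paper derives its optimal update factor analytically:

```python
import random

CW_MIN, CW_MAX = 16, 1024  # common IEEE 802.11 DCF bounds

def beb_cw(retries: int) -> int:
    # Standard BEB: the contention window doubles on every retry
    # (fixed update factor of 2), capped at CW_MAX.
    return min(CW_MIN * (2 ** retries), CW_MAX)

def adaptive_cw(retries: int, est_nodes: int) -> int:
    # Illustrative self-adaptive variant (NOT the paper's derived formula):
    # the update factor grows with the estimated number of contending
    # nodes, so a crowded network backs off more aggressively.
    factor = 1.5 if est_nodes <= 5 else 2.0 if est_nodes <= 20 else 3.0
    return min(int(CW_MIN * factor ** retries), CW_MAX)

def backoff_slots(cw: int) -> int:
    # A station then waits a uniform random number of slots in [0, cw - 1].
    return random.randrange(cw)

print(beb_cw(3), adaptive_cw(3, 3), adaptive_cw(3, 50))  # 128 54 432
```

With few contending nodes the adaptive window stays small (less idle time); with many nodes it grows faster than BEB (fewer collisions), which is the trade-off the abstract describes.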

  • 124.
    Palm, Emanuel
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Mitra, Karan
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Saguna, Saguna
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Bayesian system for cloud performance diagnosis and prediction (2017). In: Proceedings of the International Conference on Cloud Computing Technology and Science, CloudCom, Piscataway, NJ: IEEE Computer Society, 2017, p. 371-374, article id 7830706. Conference paper (Refereed)
    Abstract [en]

    The stochastic nature of cloud systems makes cloud quality of service (QoS) performance diagnosis and prediction a challenging task. A plethora of factors, including virtual machine types, data centre regions, CPU types, time-of-day, and day-of-week, contribute to the variability of cloud QoS. The state-of-the-art methods for cloud performance diagnosis do not capture and model the complex and uncertain inter-dependencies between these factors for efficient cloud QoS diagnosis and prediction. This paper presents ALPINE, a proof-of-concept system based on Bayesian networks. Using a real-life dataset, we demonstrate that ALPINE can be utilised for efficient cloud QoS diagnosis and prediction under stochastic cloud conditions.
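As a rough illustration of Bayesian-network-style diagnosis (with invented probabilities, not ALPINE's learned model), the posterior over a hidden factor given an observed QoS symptom follows from Bayes' rule:

```python
# Minimal Bayesian diagnosis sketch: given that poor QoS was observed,
# infer which VM type was most likely in use. All probabilities are
# made up for illustration and are not from the cited paper.
prior = {"small": 0.5, "large": 0.5}   # P(vm_type)
p_poor = {"small": 0.4, "large": 0.1}  # P(poor QoS | vm_type)

def posterior(prior, likelihood):
    # Bayes' rule: multiply prior by likelihood, then normalize.
    unnorm = {k: prior[k] * likelihood[k] for k in prior}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

post = posterior(prior, p_poor)
print({k: round(v, 2) for k, v in post.items()})  # {'small': 0.8, 'large': 0.2}
```

A full Bayesian network chains many such conditional tables (region, CPU type, time-of-day) and performs the same kind of inference over all of them jointly.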

  • 125.
    Hossain, Mohammad Shahadat
    et al.
    University of Chittagong, Bangladesh.
    Rahaman, Saifur
    Department of Computer Science and Engineering, International Islamic University Chittagong.
    Kor, Ah-Lian
    School of Computing, Creative Technologies and Engineering, Leeds Beckett University.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Pattison, Colin
    School of Computing, Creative Technologies and Engineering, Leeds Beckett University.
    A Belief Rule Based Expert System for Datacenter PUE Prediction under Uncertainty (2017). In: IEEE Transactions on Sustainable Computing, ISSN 2377-3782, Vol. 2, no 2, p. 140-153. Article in journal (Refereed)
    Abstract [en]

    A rapidly emerging trend in the IT landscape is the uptake of large-scale datacenters moving storage and data processing to providers located far away from the end-users or locally deployed servers. For these large-scale datacenters, power efficiency is a key metric, with the PUE (Power Usage Effectiveness) and DCiE (Data Centre infrastructure Efficiency) being important examples. This article proposes a belief rule based expert system to predict datacenter PUE under uncertainty. The system has been evaluated using real-world data from a datacenter in the UK. The results would help in planning the construction of new datacenters and in redesigning existing datacenters to make them more power efficient, leading to a more sustainable computing environment. In addition, an optimal learning model for the BRBES is demonstrated and compared with an ANN and a Genetic Algorithm, with promising results.

  • 126.
    Hossain, Mohammad Shahadat
    et al.
    University of Chittagong, Bangladesh.
    Ahmed, Faisal
    University of Chittagong, Bangladesh.
    Tuj-Johora, Fatema
    University of Chittagong, Bangladesh.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Belief Rule Based Expert System to Assess Tuberculosis under Uncertainty (2017). In: Journal of medical systems, ISSN 0148-5598, E-ISSN 1573-689X, Vol. 41, no 3, article id 43. Article in journal (Refereed)
    Abstract [en]

    The primary diagnosis of Tuberculosis (TB) is usually carried out by looking at the various signs and symptoms of a patient. However, these signs and symptoms cannot be measured with 100% certainty since they are associated with various types of uncertainty such as vagueness, imprecision, randomness, ignorance and incompleteness. Consequently, traditional primary diagnosis based on these signs and symptoms, as carried out by physicians, cannot deliver reliable results. Therefore, this article presents the design, development and applications of a Belief Rule Based Expert System (BRBES) with the ability to handle various types of uncertainty to diagnose TB. The knowledge base of this system is constructed by taking experts' suggestions and by analyzing historical data of TB patients. Experiments carried out on the data of 100 patients demonstrate that the BRBES's results are more reliable than those of a human expert as well as of a fuzzy rule based expert system.

  • 127.
    Hossain, Mohammad Shahadat
    et al.
    University of Chittagong, Bangladesh.
    Habib, Israt Binteh
    University of Chittagong.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Belief Rule based Expert System to Diagnose Dengue Fever under Uncertainty (2017). In: Proceedings of Computing Conference 2017 / [ed] Liming Chen, Nikola Serbedzija, Kami Makki, Nazih Khaddaj Mallat, Kohei Arai, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 179-186. Conference paper (Refereed)
    Abstract [en]

    Dengue Fever is a debilitating mosquito-borne disease, causing sudden fever and leading to fatality in many cases. A dengue patient is diagnosed by physicians by looking at the various signs, symptoms and risk factors of this disease. However, these signs, symptoms and risk factors cannot be measured with 100% certainty, since various types of uncertainty such as imprecision, vagueness, ambiguity, and ignorance are associated with them. Hence, it is difficult for physicians to diagnose a dengue patient accurately, since these uncertainties are not taken into account. Therefore, this paper presents the design, development and applications of an expert system that incorporates a belief rule base as the knowledge representation schema and evidential reasoning as the inference mechanism, with the capability of handling various types of uncertainty to diagnose dengue fever. The results generated by the expert system are more reliable than those from a fuzzy rule based system or a human expert.

  • 128.
    Wan, Jiafu
    et al.
    Guangdong Provincial Key Laboratory of Precision Equipment and Manufacturing Technology, South China University of Technology.
    Tang, Shenglong
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Li, Di
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Wang, Shiyong
    School of Mechanical and Automotive Engineering, South China University of Technology.
    Liu, Chengliang
    School of Mechanical Engineering, Shanghai Jiao Tong University.
    Abbas, Haider
    Center of Excellence in Information Assurance, King Saud University.
    Vasilakos, Athanasios V.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Manufacturing Big Data Solution for Active Preventive Maintenance (2017). In: IEEE Transactions on Industrial Informatics, ISSN 1551-3203, E-ISSN 1941-0050, Vol. 13, no 4, p. 2039-2047, article id 7857790. Article in journal (Refereed)
    Abstract [en]

    Industry 4.0 has become more popular due to recent developments in Cyber-Physical Systems (CPS), big data, cloud computing, and industrial wireless networks. Intelligent manufacturing has produced a revolutionary change, and evolving applications such as product lifecycle management are becoming a reality. In this paper, we propose and implement a manufacturing big data solution for active preventive maintenance in manufacturing environments. First, we provide the system architecture that is used for active preventive maintenance. Then, we analyze the method used for collection of manufacturing big data according to the data characteristics. Subsequently, we perform data processing in the cloud, including the cloud layer architecture, the real-time active maintenance mechanism, and the off-line prediction and analysis method. Finally, we analyze a prototype platform and implement experiments to compare the traditional method with the proposed active preventive maintenance method. The manufacturing big data method used for active preventive maintenance has the potential to accelerate implementation of Industry 4.0.

  • 129.
    D'Orazio, Christian Javier
    et al.
    School of Information Technology and Mathematical Sciences, University of South Australia, Australia.
    Rongxing, Lu
    Faculty of Computer Science, University of New Brunswick, Fredericton, NB, Canada.
    Choo, Kim Kwang Raymond
    School of Information Technology and Mathematical Sciences, University of South Australia, Australia.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Markov adversary model to detect vulnerable iOS devices and vulnerabilities in iOS apps (2017). In: Applied Mathematics and Computation, ISSN 0096-3003, E-ISSN 1873-5649, Vol. 293, p. 523-544. Article in journal (Refereed)
    Abstract [en]

    With the increased convergence of technologies whereby a user can access, store and transmit data across different devices in real time, risks arise from factors such as a lack of appropriate security measures and users lacking the requisite security awareness or not fully understanding how security measures can be used to their advantage. In this paper, we adapt our previously published adversary model for digital rights management (DRM) apps and demonstrate how it can be used to detect vulnerable iOS devices and to analyse (non-DRM) apps for vulnerabilities that can potentially be exploited. Using our adversary model, we investigate several (jailbroken and non-jailbroken) iOS devices, Australian Government Medicare Expert Plus (MEP) app, Commonwealth Bank of Australia app, Western Union app, PayPal app, PocketCloud Remote Desktop app and Simple Transfer Pro app, and reveal previously unknown vulnerabilities. We then demonstrate how the identified vulnerabilities can be exploited to expose the user's sensitive data and personally identifiable information stored on or transmitted from the device. We conclude with several recommendations to enhance the security and privacy of user data stored on or transmitted from these devices.

  • 130.
    Kumar, Neeraj
    et al.
    Thapar University, Department of Computer Science & Engineering, Patiala, Punjab.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Rodrigues, Joel J.P.C.
    University of Beira Interior.
    A Multi-Tenant Cloud-Based DC Nano Grid for Self-Sustained Smart Buildings in Smart Cities (2017). In: IEEE Communications Magazine, ISSN 0163-6804, E-ISSN 1558-1896, Vol. 55, no 3, p. 14-21, article id 7876851. Article in journal (Refereed)
    Abstract [en]

    Energy is one of the most valuable resources of the modern era and needs to be consumed in an optimized manner through the intelligent usage of various smart devices, which are major sources of energy consumption nowadays. With the popularity of low-voltage DC appliances such as LEDs, computers, and laptops, there arises a need to design new solutions for self-sustainable smart energy buildings containing these appliances. These smart buildings constitute the next generation of smart cities. Keeping focus on these points, this article proposes a cloud-assisted DC nanogrid for self-sustainable smart buildings in next-generation smart cities. As there may be a large number of such smart buildings in different smart cities in the near future, a huge amount of data with respect to demand and generation of electricity is expected to be generated from all such buildings. This data would be of heterogeneous types, as it would be generated from different types of appliances in these smart buildings. To handle this situation, we have used a cloud-based infrastructure to make intelligent decisions with respect to the energy usage of various appliances. This results in an uninterrupted DC power supply to all low-voltage DC appliances with minimal dependence on the grid. Hence, the extra burden on the main grid in peak hours is reduced, as buildings in smart cities would be self-sustainable with respect to their energy demands.

  • 131.
    Li, Yaoxing
    et al.
    Institute of Operating System and Computer Networks, Technische Universität Braunschweig.
    Li, Yuhong
    State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications.
    Wolf, Lars C.
    Institute of Operating System and Computer Networks, Technische Universität Braunschweig.
    Lindgren, Anders
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Wang, Ji
    State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications.
    A named data approach for DTN routing (2017). In: 2017 Wireless Days, WD 2017 / [ed] Salgado H., Ruela J., Pessoa L., Teixeira F., Ricardo M., Campos R., Morla R, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 163-166, article id 7918135. Conference paper (Refereed)
    Abstract [en]

    Many common DTN routing protocols are replication-based; these have relatively good performance in terms of message delivery ratio but high overhead, and leave the issue of garbage collection open. In this paper, we propose Named Data Distance Routing (NDDR), a named-data-based DTN routing approach that makes routing decisions for named data based on topological distance information, which helps to reduce routing overhead. We have implemented NDDR in the ONE simulation environment, and the simulation results show that the proposed routing method performs better in terms of message delivery ratio and network overhead than several typical replication-based DTN routing protocols.
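The core idea of distance-based forwarding can be caricatured in a few lines: instead of replicating a bundle to every contact, a node forwards named data only to neighbours that are topologically closer to the data's destination. This is a deliberate simplification of NDDR, with made-up node identifiers:

```python
# Toy distance-based DTN forwarding decision, in the spirit of (but not
# identical to) NDDR: forward a named bundle only to neighbours that
# advertise a smaller topological distance to the content's destination,
# rather than replicating to every contact as epidemic routing would.
def should_forward(my_distance: int, neighbour_distance: int) -> bool:
    return neighbour_distance < my_distance

def next_hops(my_distance: int, neighbours: dict) -> list:
    # neighbours: {node_id: advertised distance to the named data's target}
    return [n for n, d in neighbours.items() if should_forward(my_distance, d)]

print(next_hops(4, {"a": 5, "b": 2, "c": 4}))  # ['b']
```

Because only strictly-closer neighbours receive a copy, the number of replicas per bundle is bounded, which is where the overhead reduction reported in the abstract comes from.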

  • 132.
    Jing, Xu
    et al.
    College of Information Engineering, Northwest A & F University, Yangling.
    Hu, Hanwen
    College of Information Engineering, Northwest A & F University, Yangling.
    Yang, Huijun
    College of Information Engineering, Northwest A & F University, Yangling.
    Au, Man Ho
    Department of Computing, The Hong Kong Polytechnic University.
    Li, Shuqin
    College of Information Engineering, Northwest A & F University.
    Xiong, Naixue
    Department of Mathematics and Computer Science, Northeastern State University, Tahlequah, OK.
    Imran, Muhammad
    College of Computer and Information Sciences, King Saud University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Quantitative Risk Assessment Model Involving Frequency and Threat Degree under Line-of-Business Services for Infrastructure of Emerging Sensor Networks (2017). In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 17, no 3, article id 642. Article in journal (Refereed)
    Abstract [en]

    The prospect of Line-of-Business Services (LoBSs) for infrastructure of Emerging Sensor Networks (ESNs) is exciting. Access control remains a top challenge in this scenario, as the service provider's server contains many valuable resources. LoBSs' users are very diverse, as they may come from a wide range of locations with vastly different characteristics. The cost of joining can be low, and in many cases intruders are eligible users conducting malicious actions. As a result, user access should be adjusted dynamically. Assessing LoBSs' risk dynamically based on both the frequency and the threat degree of malicious operations is therefore necessary. In this paper, we propose a Quantitative Risk Assessment Model (QRAM) involving frequency and threat degree based on value at risk. To quantify the threat degree as an elementary intrusion effort, we amend the influence coefficient of risk indexes in the network security situation assessment model. To quantify threat frequency as intrusion trace effort, we make use of multiple behavior information fusion. Under the influence of intrusion trace, we adapt the historical simulation method of value at risk to dynamically assess LoBSs' risk. Simulation based on existing data is used to select appropriate parameters for QRAM. Our simulation results show that the duration influence on elementary intrusion effort is reasonable when the normalized parameter is 1000. Likewise, the time window of intrusion trace and the weight between objective risk and subjective risk can be set to 10 s and 0.5, respectively. While our focus is to develop QRAM for dynamically assessing the risk of LoBSs for infrastructure of ESNs involving frequency and threat degree, we believe it is also appropriate for other scenarios in cloud computing.
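The historical simulation method of value at risk, which QRAM adapts, can be sketched independently of the paper's risk indexes. The loss series below is fabricated purely for illustration:

```python
def historical_var(losses, confidence=0.95):
    # Historical-simulation VaR: sort the observed losses and take the
    # loss at the `confidence` quantile, i.e. the loss that is not
    # exceeded with probability `confidence` over the historical window.
    # round() guards against floating-point error in the index.
    ordered = sorted(losses)
    idx = max(int(round(confidence * len(ordered))) - 1, 0)
    return ordered[idx]

# Fabricated historical loss observations (arbitrary units).
losses = [1, 2, 2, 3, 3, 4, 5, 5, 6, 8, 9, 10, 12, 15, 20, 25, 30, 40, 55, 80]
print(historical_var(losses, 0.95))  # 55
```

QRAM layers its frequency and threat-degree weighting on top of this basic quantile estimate; the sketch shows only the underlying historical-simulation step.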

  • 133.
    Li, Xiaomin
    et al.
    School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou.
    Li, Di
    School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou.
    Wan, Jiafu
    School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Lai, Chinfeng
    Department of Computer Science and Information Engineering, National Chung Cheng University, Jiayi.
    Wang, Shiyong
    School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou.
    A review of industrial wireless networks in the context of Industry 4.02017In: Wireless networks, ISSN 1022-0038, E-ISSN 1572-8196, Vol. 23, no 1, p. 23-41Article in journal (Refereed)
    Abstract [en]

    There have been many recent advances in wireless communication technologies, particularly in the area of wireless sensor networks, which have undergone rapid development and been successfully applied in the consumer electronics market. Consequently, wireless networks (WNs) have been attracting more attention from academic communities and other domains. From an industrial perspective, WNs present many advantages, including flexibility, low cost and easy deployment. WNs can therefore play a vital role in the Industry 4.0 framework and can be used for smart factories and intelligent manufacturing systems. In this paper, we present an overview of industrial WNs (IWNs), discuss IWN features and related techniques, and then provide a new architecture based on quality of service and quality of data for IWNs. We also describe applications of IWNs and relevant IWN standards. We then use a case from our previous work to explain how to design an IWN under Industry 4.0. Finally, we highlight some of the design challenges and open issues that still need to be addressed to make IWNs truly ubiquitous for a wide range of applications.

  • 134.
    Bai, Emilien
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Synnes, Kåre
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Reward System for Collaborative Care of Elderly based on Distributed Ledger Technologies2017In: Proceedings of the Eleventh International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies: UBICOMM 2017, 2017Conference paper (Refereed)
    Abstract [en]

    This paper presents the design and implementation of a reward system for collaborative care of elderly people based on distributed ledger technologies. The work is motivated by the demographic change, where an aging population consequently increases the need for care. This causes great tension in our society, as care resources become increasingly constrained, both regarding costs and the availability of care staff. Much of the daily care of the elderly is today done by family members (spouses, children) and friends, often on a voluntary basis, which adds to the tension. The core idea of this work is to help broaden the involvement of people in caring for our elderly, enabled by a system for collaborative care. The proposed system benefits from recent advances in distributed ledger technologies, which, similarly to digital currencies, are built on the ability to reach mutual agreements between people who do not know each other. The system also benefits from recent gamification techniques to motivate people to collaborate on a larger scale through performing simple daily tasks. Agreements between elderly people and volunteers are implemented as smart contracts, and rewards are automatically given when these contracts are fulfilled, a gamification technique that is believed to maintain the volunteers' motivation. In this paper, we thus describe a reward system designed to connect elderly people and volunteers by mutual agreements implemented as smart contracts.
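
    As a toy illustration of the agreement-then-reward flow the abstract describes (not the paper's distributed-ledger implementation; the class names, task description, and point values are all assumed):

    ```python
    # Hypothetical sketch: a care "contract" that releases reward points to a
    # volunteer exactly once, when the agreed task is marked fulfilled.
    class CareTask:
        def __init__(self, description, reward):
            self.description = description
            self.reward = reward
            self.fulfilled = False

    class RewardLedger:
        def __init__(self):
            self.balances = {}      # volunteer -> accumulated reward points

        def complete(self, task, volunteer):
            """Release the reward once; repeat completions are rejected."""
            if task.fulfilled:
                return False
            task.fulfilled = True
            self.balances[volunteer] = self.balances.get(volunteer, 0) + task.reward
            return True
    ```

    In the paper's setting this single-writer ledger would instead be a smart contract on a distributed ledger, so that no single party controls whether a reward is released.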

  • 135.
    Siddiquee, Kazy Noor E Alam
    et al.
    University of Science and Technology Chittagong.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Khan, Faria Farjana
    University of Science and Technology Chittagong, Bangladesh.
    Hossain, Mohammad Shahadat
    University of Chittagong, Bangladesh.
    A Scalable and Secure MANET for an i-Voting System2017In: Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications, ISSN 2093-5374, E-ISSN 2093-5382, Vol. 8, no 3, p. 1-17, article id 1Article in journal (Refereed)
    Abstract [en]

    Internet Voting (i-Voting) is an online electronic voting process in which a voter can vote from anywhere while connected to a wireless network of a target place. In this paper, a wireless network built as a MANET is considered for the voting process, with the national parliamentary voting process of Bangladesh as the case study. The MANET is built from stationary wireless nodes and mobile wireless nodes. Voters carry the mobile wireless nodes, with which they cast their votes. The stationary wireless nodes are installed and deployed in the MANET in a polling area selected by the national election agency. These nodes connect directly to the national database of voters and perform the authentication and validation of a voter (a mobile node) before a vote is cast. Data transactions are routed securely only after strong authentication and validation of the user has been confirmed. The whole process takes place in a scalable wireless network with a distributed, goal-based approach, and all steps are followed by secure routing of data in the MANET. The optimal routing protocol among OLSR, AODV, DSR, TORA and GRP is chosen. Denial of Service (DoS) attacks are considered the major threat to nodes in this MANET. The simulation work is done in the OPNET simulator.

  • 136.
    Ye, Dayong
    et al.
    School of Software and Electrical Engineering, Swinburne University of Technology, Melbourne.
    Zhang, Minjie
    School of Computer Science and Software Engineering, University of Wollongong.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Survey of Self-Organization Mechanisms in Multiagent Systems2017In: IEEE Transactions on Systems, Man, and Cybernetics: Systems, ISSN 2168-2216, Vol. 47, no 3, p. 441-461Article in journal (Refereed)
    Abstract [en]

    This paper surveys the literature over the last decades in the field of self-organizing multiagent systems. Self-organization has been extensively studied and applied in multiagent systems and other fields, e.g., sensor networks and grid systems. Self-organization mechanisms in other fields have been thoroughly surveyed. However, there has not been a survey of self-organization mechanisms developed for use in multiagent systems. In this paper, we provide a survey of existing literature on self-organization mechanisms in multiagent systems. We also highlight the future work on key research issues in multiagent systems. This paper can serve as a guide and a starting point for anyone who will conduct research on self-organization in multiagent systems. Also, this paper complements existing survey studies on self-organization in multiagent systems.

  • 137.
    Hridoy, Md Rafiul Sabbir
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Islam, Raihan Ul
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Hossain, Mohammad Shahadat
    Department of Computer Science and Engineering, University of Chittagong.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    A Web Based Belief Rule Based Expert System for Assessing Flood Risk2017In: iiWAS'17: Proceedings of the 19th International Conference on Information Integration and Web-based Applications & Services, New York: ACM Digital Library, 2017, p. 434-440Conference paper (Refereed)
    Abstract [en]

    Natural calamities such as floods, volcanic eruptions and tornadoes hamper our daily life and cause much suffering. Flooding is among the most catastrophic of these calamities, and assessing flood risk helps us take necessary steps and save human lives. Several heterogeneous factors are used to assess the flood risk to the livelihood of an area, and several types of uncertainty can be associated with each factor. In this paper, we propose a web-based flood risk assessment expert system that combines a belief rule base with the capability of reading data and generating web-based output. The paper also introduces a generic RESTful API that can be used without writing a belief rule based expert system from scratch. The expert system facilitates the monitoring of the various factors that contribute to increasing the flood risk to the livelihood of an area. Eventually, decision makers should be able to take measures to control those factors and reduce the risk of flooding. Data for the expert system were collected from a case study area by conducting interviews.

  • 138.
    Revadigar, Girish
    et al.
    School of Computer Science and Engineering, UNSW Australia.
    Javali, Chitra
    School of Computer Science and Engineering, UNSW Australia.
    Xu, Weitao
    School of Information Technology and Electrical Engineering, University of Queensland, Brisbane.
    Vasilakos, Athanasios V.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Hu, Wen
    School of Computer Science and Engineering UNSW Australia, Sydney.
    Jha, Sanjay
    School of Computer Science and Engineering UNSW Australia, Sydney.
    Accelerometer and Fuzzy Vault-Based Secure Group Key Generation and Sharing Protocol for Smart Wearables2017In: IEEE Transactions on Information Forensics and Security, ISSN 1556-6013, E-ISSN 1556-6021, Vol. 12, no 10, p. 2467-2482Article in journal (Refereed)
    Abstract [en]

    The increased usage of smart wearables in various applications, particularly in healthcare, emphasizes the need for secure communication to transmit sensitive health data. In a practical scenario where multiple devices are carried by a person, a common secret key is essential for secure group communication. Group key generation and sharing among wearables has received very little attention in the literature due to the underlying challenges: (i) the difficulty of obtaining a good source of randomness to generate strong cryptographic keys, and (ii) finding a common feature among all the devices with which to share the key. In this paper, we present a novel solution to generate and distribute group secret keys by exploiting the on-board accelerometer sensor and the unique walking style of the user, i.e., gait. We propose a method to identify suitable samples of accelerometer data during all routine activities of a subject to generate keys with high entropy. In our scheme, a smartphone placed on the waist employs a fuzzy vault, a cryptographic construct, and utilizes the acceleration due to gait, a common characteristic extracted on all wearable devices, to share the secret key. We implement our solution on commercially available off-the-shelf smart wearables, measure the system performance, and conduct experiments with multiple subjects. Our results demonstrate that the proposed solution has a bit rate of 750 bps and low system overhead, distributes the key securely and quickly to all legitimate devices, and is suitable for practical applications.

  • 139.
    Li, Xuran
    et al.
    Faculty of Information Technology, Macau University of Science and Technology.
    Dai, Hong-Ning
    Faculty of Information Technology, Macau University of Science and Technology.
    Wang, Qiu
    Faculty of Information Technology, Macau University of Science and Technology, Macau.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    AE-shelter: An novel anti-eavesdropping scheme in wireless networks2017In: IEEE International Conference on Communications, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, article id 7996847Conference paper (Refereed)
    Abstract [en]

    To protect confidential communications from eavesdropping attacks in wireless networks, we propose a novel anti-eavesdropping scheme named AE-Shelter, in which a number of friendly jammers are placed at a circular boundary to protect legitimate communications. The jammers send artificial noise that degrades eavesdroppers' ability to wiretap confidential information. We also establish a theoretical model to evaluate the performance of AE-Shelter. Our results show that the proposed scheme can significantly reduce the eavesdropping risk without noticeably degrading the network performance.

  • 140.
    Mitra, Karan
    et al.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Saguna, Saguna
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Åhlund, Christer
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Ranjan, Rajiv
    Newcastle University.
    ALPINE: A Bayesian System For Cloud Performance Diagnosis And Prediction2017In: 2017 IEEE International Conference on Services Computing (SCC), Piscataway, NJ: IEEE, 2017, p. 281-288, article id 8034996Conference paper (Refereed)
    Abstract [en]

    Cloud performance diagnosis and prediction is a challenging problem due to the stochastic nature of cloud systems. Cloud performance is affected by a large set of factors, such as virtual machine types, regions, workloads, and wide area network delay and bandwidth, necessitating the determination of complex relationships between these factors. Current research in this area does not address the challenge of modeling these uncertain and complex relationships, and the challenge of cloud performance prediction under uncertainty has not garnered sufficient attention. This paper proposes, develops and validates ALPINE, a Bayesian system for cloud performance diagnosis and prediction. ALPINE incorporates Bayesian networks to model the uncertain and complex relationships between the factors mentioned above. It handles missing, scarce and sparse data to diagnose and predict stochastic cloud performance efficiently. We validate the proposed system using extensive real data and show that it predicts cloud performance with a high accuracy of 91.93%.
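
    The core Bayesian-network idea, including how such a model copes with a missing observation, can be shown on a deliberately tiny example. This is an assumed two-node network (VM type influencing latency) with made-up probabilities, not ALPINE's actual model or factors.

    ```python
    # Toy sketch: exact inference in a two-node Bayesian network
    # VMType -> Latency, by enumeration. A missing observation of the
    # parent is handled by marginalizing it out.
    P_VM = {"small": 0.6, "large": 0.4}          # prior over VM type (assumed)
    P_LAT = {                                    # P(latency | vm type) (assumed)
        "small": {"high": 0.7, "low": 0.3},
        "large": {"high": 0.2, "low": 0.8},
    }

    def p_latency(latency, vm_type=None):
        """P(latency) if vm_type is unobserved, else P(latency | vm_type)."""
        if vm_type is not None:
            return P_LAT[vm_type][latency]
        # Marginalize over the missing parent: sum_v P(v) * P(latency | v)
        return sum(P_VM[v] * P_LAT[v][latency] for v in P_VM)
    ```

    A full diagnosis model would have many such nodes (region, workload, network delay, bandwidth) and learn the tables from data, but the same enumeration principle is what lets the network answer queries when some factors are unmeasured.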

  • 141.
    Challa, Srinavi
    et al.
    Center for Security, Theory and Algorithmic Research, International Institute of Information Technology, Hyderabad .
    Kumar Das, Ashok
    Center for Security, Theory and Algorithmic Research, International Institute of Information Technology, Hyderabad .
    Odelu, Vanga
    Department of Computer Science and Engineering, Indian Institute of Information Technology Chittoor.
    Kumar, Neeraj
    Department of Computer Science and Engineering, Thapar University, Patiala .
    Kumari, Saru
    Department of Mathematics, Ch. Charan Singh University, Meerut .
    Khan, Muhammad Khurram
    Center of Excellence in Information Assurance, King Saud University, Riyadh.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    An efficient ECC-based provably secure three-factor user authentication and key agreement protocol for wireless healthcare sensor networks2017In: Computers & electrical engineering, ISSN 0045-7906, E-ISSN 1879-0755, Vol. 69, p. 534-554Article in journal (Refereed)
    Abstract [en]

    We first show the security limitations of a recent user authentication scheme proposed for wireless healthcare sensor networks. We then present a provably secure three-factor user authentication and key agreement protocol for wireless healthcare sensor networks. The proposed scheme supports functionality features, such as dynamic sensor node addition, password as well as biometrics update, smart card revocation along with other usual features required for user authentication in wireless sensor networks. Our scheme is shown to be secure through the rigorous formal security analysis under the Real-Or-Random (ROR) model and broadly-accepted Burrows-Abadi-Needham (BAN) logic. Furthermore, the simulation through the widely-known Automated Validation of Internet Security Protocols and Applications (AVISPA) tool shows that our scheme is also secure. High security, and low communication and computation costs make our scheme more suitable for practical application in healthcare applications as compared to other related existing schemes.

  • 142.
    Jindal, Anish
    et al.
    CSE Department, Thapar University.
    Dua, Amit
    Department of Computer Science and Information Systems, BITS Pilani.
    Kumar, Neeraj
    CSE Department, Thapar University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Rodrigues, Joel J.P.C.
    National Institute of Telecommunications (Inatel), Brazil.
    An efficient fuzzy rule-based big data analytics scheme for providing healthcare-as-a-service2017In: IEEE International Conference on Communications, Piscataway, NJ: Institute of Electrical and Electronics Engineers (IEEE), 2017, article id 7996965Conference paper (Refereed)
    Abstract [en]

    With advancements in information and communication technology (ICT), there is an increase in the number of users of remote healthcare applications. The data collected about patients in these applications vary with respect to volume, velocity, variety, veracity, and value. Processing such a large collection of heterogeneous data is one of the biggest challenges and needs a specialized approach. To address this issue, this paper presents a new fuzzy rule-based classifier for big data handling using cloud-based infrastructure, with the aim of providing Healthcare-as-a-Service (HaaS) to users at remote locations. The proposed scheme is based upon cluster formation using the modified Expectation-Maximization (EM) algorithm and processing of the big data in the cloud environment. A fuzzy rule-based classifier is then designed for efficient decision making about data classification in the proposed scheme. The scheme is evaluated with respect to different evaluation metrics such as classification time, response time, accuracy and false positive rate. The results obtained are compared with standard techniques to confirm the effectiveness of the proposed scheme.
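
    The fuzzy rule-based classification step can be sketched in miniature. The feature (heart rate), the membership breakpoints, and the rules below are all invented for illustration; the paper's actual rule base and features are not reproduced here.

    ```python
    # Minimal sketch of a fuzzy rule-based classifier on one assumed feature.
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def classify(heart_rate):
        """Fire fuzzy rules and return the conclusion of the strongest rule."""
        memberships = {
            "low":    tri(heart_rate, 20, 45, 65),
            "normal": tri(heart_rate, 55, 75, 95),
            "high":   tri(heart_rate, 85, 130, 200),
        }
        rules = {"low": "alert", "normal": "ok", "high": "alert"}
        # Each rule's firing strength is its antecedent's membership degree.
        strongest = max(memberships, key=memberships.get)
        return rules[strongest]
    ```

    The advantage over crisp thresholds is that inputs near a boundary partially activate several rules, which is what makes the classifier robust to the noisy, heterogeneous measurements the abstract describes.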

  • 143.
    Agreste, Santa
    et al.
    Department of Mathematics and Computer Science, Physical Sciences and Earth Sciences, University of Messina.
    De Meo, Pasquale
    Department of Ancient and Modern Civilizations, University of Messina.
    Fiumara, Giacomo
    Department of Mathematics and Computer Science, Physical Sciences and Earth Sciences, University of Messina.
    Piccione, Giuseppe
    Department of Mathematics and Computer Science, Physical Sciences and Earth Sciences, University of Messina.
    Piccolo, Sebastiano
    Department of Management Engineering - Engineering Systems Division at the Technical University of Denmark.
    Rosaci, Domenico
    DIIES Department, University of Reggio Calabria Via Graziella.
    Sarné, Giuseppe M. L.
    DICEAM Department, University of Reggio Calabria Via Graziella.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    An empirical comparison of algorithms to find communities in directed graphs and their application in Web Data Analytics2017In: IEEE Transactions on Big Data, E-ISSN 2332-7790, Vol. 3, no 3, p. 289-306Article in journal (Refereed)
    Abstract [en]

    Detecting communities in graphs is a fundamental tool to understand the structure of Web-based systems and predict their evolution. Many community detection algorithms are designed to process undirected graphs (i.e., graphs with bidirectional edges), but many graphs on the Web - e.g., microblogging Web sites, trust networks or the Web graph itself - are directed. Few community detection algorithms deal with directed graphs, and we lack an experimental comparison of them. In this paper we evaluate several community detection algorithms in terms of accuracy and scalability. A first group of algorithms (Label Propagation and Infomap) is explicitly designed to manage directed graphs, while a second group (e.g., WalkTrap) simply ignores edge directionality; finally, a third group of algorithms (e.g., Eigenvector) maps input graphs onto undirected ones and extracts communities from the symmetrized version of the input graph. We ran our tests on both artificial and real graphs: on artificial graphs, WalkTrap achieved the highest accuracy, closely followed by other algorithms, while Label Propagation showed outstanding scalability on both artificial and real graphs. The Infomap algorithm showcased the best trade-off between accuracy and computational performance and therefore has to be considered a promising tool for Web Data Analytics purposes.
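
    Of the algorithm families compared above, label propagation is simple enough to sketch directly on a directed graph. The version below is a simplified synchronous variant (the surveyed implementations differ in update order and tie-breaking): labels flow along incoming edges, and each node adopts the most common label among its in-neighbours.

    ```python
    # Sketch of synchronous label propagation on a directed graph.
    from collections import Counter

    def label_propagation(edges, nodes, rounds=10):
        """edges: iterable of (src, dst) pairs; labels flow along incoming edges."""
        labels = {n: n for n in nodes}      # start: every node is its own community
        incoming = {n: [] for n in nodes}
        for src, dst in edges:
            incoming[dst].append(src)
        for _ in range(rounds):
            new = {}
            for n in nodes:
                nbrs = incoming[n]
                if not nbrs:
                    new[n] = labels[n]      # sources keep their own label
                    continue
                counts = Counter(labels[m] for m in nbrs)
                # Adopt the most common in-neighbour label (break ties by min).
                top = max(counts.values())
                new[n] = min(l for l, c in counts.items() if c == top)
            if new == labels:
                break                       # converged
            labels = new
        return labels
    ```

    Because each round is linear in the number of edges, this family scales well, which matches the scalability result the abstract reports for Label Propagation.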

  • 144.
    Abedin, Md. Zainal
    et al.
    University of Science and Technology Chittagong.
    Chowdhury, Abu Sayeed
    University of Science and Technology Chittagong.
    Hossain, Mohammad Shahadat
    University of Chittagong, Bangladesh.
    Andersson, Karl
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Karim, Razuan
    University of Science and Technology Chittagong.
    An Interoperable IP based WSN for Smart Irrigation Systems2017Conference paper (Refereed)
    Abstract [en]

    Wireless Sensor Networks (WSNs) have been highly developed and can be used in agriculture to enable optimal irrigation scheduling. Since there is an absence of widely available methods to support effective agricultural practice in different weather conditions, WSN technology can be used to optimise irrigation in crop fields. This paper presents the architecture of an irrigation system incorporating an interoperable IP-based WSN, which uses the protocol stacks and standards of the Internet of Things paradigm. The fundamental performance of this network is emulated on Tmote Sky motes for 6LoWPAN over an IEEE 802.15.4 radio link, using the Contiki OS and the Cooja simulator. The simulation results present the Round Trip Time (RTT) as well as the packet loss for different packet sizes. In addition, the average power consumption and the radio duty cycle of the sensors are studied. This will facilitate the deployment of a scalable and interoperable multi-hop WSN, the positioning of the border router, and the management of the sensors' power consumption.

  • 145.
    Lindberg, Renny
    et al.
    Vrije Universiteit Brussel.
    Laine, Teemu H.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Approaches to Detecting and Utilizing Play and Learning Styles in Adaptive Educational Games2017In: Computers Supported Education: 8th International Conference, CSEDU 2016, Rome, Italy, April 21-23, 2016, Revised Selected Papers / [ed] Gennaro Costagliola; James Uhomoibhi; Susan Zvacek; Bruce M. McLaren, Cham: Springer, 2017, p. 336-358Conference paper (Refereed)
    Abstract [en]

    Games have emerged as promising tools to make learning more fun. Pedagogical effectiveness of an educational game can increase if its behavior changes according to learners’ play and learning styles. Several models for categorizing learning and play styles exist, but not many studies simultaneously detect and utilize both style groups. To alleviate this, as the first contribution, we analyzed and compared existing learning and play style models, and chose the most suitable one from each group. Personality style models were also discussed. We then created a questionnaire based on Honey and Mumford’s Learning Style Questionnaire and Bartle’s Player Types, and collected data from 127 South Korean elementary school children. The results indicated that specific play styles were clearly more dominant (Killer 18%, Achiever 24%, Explorer 32%, Socializer 41%), whereas dominant learning styles were distributed more evenly (Activist 33%, Reflector 37%, Theorist 20% and Pragmatist 25%). As the second contribution, we presented the foundations of a generic adaptation model for utilizing learning and play styles for designing adaptive educational games.

  • 146.
    Sierla, Seppo A.
    et al.
    Department of Electrical Engineering and Automation, Aalto University.
    Karhela, Tommi A.
    Department of Electrical Engineering and Automation, Aalto University.
    Vyatkin, Valeriy
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science. Department of Electrical Engineering and Automation, Aalto University.
    Automatic Generation of Pipelines Into a 3D Industrial Process Model2017In: IEEE Access, ISSN 2169-3536, Vol. 5, p. 26591-26603Article in journal (Refereed)
    Abstract [en]

    Simulation has become an established technique to support the design of complex, mechatronic or cyber-physical systems. Ideally, simulations should already be performed at an early design phase before high-cost design commitments are made, and the recent advances in the digitalization of design information open possibilities for automatic generation of simulation models. However, high-fidelity simulation model building depends on accurate data. In particular, first-principles models are desirable source information for simulations, but such models generally are not available at an early design stage. This paper investigates the automatic generation of first-principles 3-D models for piping intensive systems based on design information that is available at an early design stage, namely piping & instrumentation diagrams (P&ID). An algorithm is presented for the generation of such 3-D models based on machine-readable P&ID information. The main focus of the algorithm is the automatic generation of feasible pipelines into the 3-D models, so that the model has sufficient information, which can be exploited in further work to automatically generate high fidelity first-principles thermo-hydraulic simulations.

  • 147.
    Kong, Linghe
    et al.
    Shanghai Key Laboratory of Scalable Computing and Systems at Shanghai Jiao Tong University, China..
    Ye, Lingsheng
    Shanghai Key Laboratory of Scalable Computing and Systems at Shanghai Jiao Tong University, China..
    Wu, Fan
    Shanghai Key Laboratory of Scalable Computing and Systems at Shanghai Jiao Tong University, China..
    Tao, Meixia
    Shanghai Key Laboratory of Scalable Computing and Systems at Shanghai Jiao Tong University, China..
    Chen, Guihai
    Shanghai Key Laboratory of Scalable Computing and Systems at Shanghai Jiao Tong University, China..
    Vasilakos, Athanasios V.
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Autonomous Relay for Millimeter-Wave Wireless Communications2017In: IEEE Journal on Selected Areas in Communications, ISSN 0733-8716, E-ISSN 1558-0008, Vol. 35, no 9, p. 2127-2136Article in journal (Refereed)
    Abstract [en]

    Millimeter-wave (mmWave) communication is the rising technology for next-generation wireless transmission. Benefiting from abundant bandwidth and a short wavelength, mmWave offers multi-gigabit transmission and beamforming. On the other hand, the short wavelength also makes mmWave easily blocked by obstacles. In order to bypass these obstacles, relays are widely needed in mmWave communications. Unmanned autonomous vehicles (UAVs), such as drones and self-driving robots, enable mobile relays in real applications. Nevertheless, it is challenging for a UAV to find its optimal relay location automatically. On the one hand, it is difficult to find the location accurately due to the complex and dynamic wireless environment; on the other hand, most applications require the relay to forward data immediately, so the autonomous process should be fast. To tackle this challenge, we propose a novel method, AutoRelay, specialized for mmWave communications. In AutoRelay, the UAV samples the link qualities of mmWave beams while moving. Based on the real-time sampling, the UAV gradually adjusts its path to approach the optimal location by leveraging compressive sensing theory to estimate the link qualities in the candidate space, which increases accuracy and saves time. Performance results demonstrate that AutoRelay outperforms existing methods in achieving an accurate and efficient relay strategy.
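
    The sample-and-move idea can be illustrated with a much simpler stand-in for AutoRelay's compressive-sensing estimator: a relay that greedily climbs on sampled link quality over a grid of candidate positions. The grid, the quality map, and the greedy rule are all assumptions made for illustration.

    ```python
    # Toy sketch (not AutoRelay): greedy ascent on sampled link quality.
    def best_relay_position(quality, start):
        """quality: dict mapping (x, y) -> sampled link quality; climb until
        no adjacent grid cell offers a better sample."""
        pos = start
        while True:
            x, y = pos
            neighbours = [(x + dx, y + dy)
                          for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if (x + dx, y + dy) in quality]
            best = max(neighbours + [pos], key=lambda p: quality[p])
            if quality[best] <= quality[pos]:
                return pos      # local optimum: no adjacent cell is better
            pos = best
    ```

    A pure greedy walk like this can get stuck in local optima and needs one sample per visited cell; AutoRelay's contribution is precisely to avoid that by reconstructing the quality field over the whole candidate space from sparse samples.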

  • 148.
    Ramadan, Rabie A.
    et al.
    Department of Computer Engineering, Cairo University, Egypt and Hail University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Brain Computer Interface: control Signals Review2017In: Neurocomputing, ISSN 0925-2312, E-ISSN 1872-8286, Vol. 223, p. 26-44Article in journal (Refereed)
    Abstract [en]

    Brain Computer Interface (BCI) is defined as a combination of hardware and software that allows brain activities to control external devices or even computers. Research in this field has attracted academia and industry alike. The objective is to help severely disabled people live as normal a life as possible. Some of these disabilities are categorized as neurological and neuromuscular disorders. A BCI system goes through many phases, including preprocessing, feature extraction, signal classification and, finally, control. A large body of research exists for each phase, which may confuse researchers and BCI developers. This article reviews the state-of-the-art work in the field of BCI. The main focus of this review is on brain control signals, their types and classifications. In addition, this survey reviews current BCI technology in terms of hardware and software: the most used BCI devices are described, and the most utilized software platforms are explained. Finally, BCI challenges and future directions are stated. Due to limited space and the large body of literature in the field of BCI, another two review articles are planned. One of these articles reviews up-to-date BCI algorithms and techniques for signal processing, feature extraction, signal classification and control; the other will be dedicated to BCI systems and applications. The three articles are written as a base and guidelines for researchers and developers pursuing work in the field of BCI.

  • 149.
    Yan, Huan
    et al.
    School of Electronic and Information Engineering, Beijing Jiaotong University.
    Gao, Deyun
    School of Electronic and Information Engineering, Beijing Jiaotong University.
    Su, Wei
    School of Electronic and Information Engineering, Beijing Jiaotong University.
    Foh, Chuan Heng
    5G-IC, Department of Electrical and Electronic Engineering, Institute for Communication Systems, University of Surrey.
    Zhang, Hongke
    School of Electronic and Information Engineering, Beijing Jiaotong University.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Caching Strategy Based on Hierarchical Cluster for Named Data Networking2017In: IEEE Access, E-ISSN 2169-3536, Vol. 5, p. 8433-8443, article id 7898837Article in journal (Refereed)
    Abstract [en]

    The in-network caching strategy in named data networking can not only reduce the unnecessary fetching of content from the original content server deep in the core network and improve user response time, but also ease the traffic in the core network. However, challenges exist in in-network caching: the distributed locations of storage and the relatively small cache space limit the hit rate, and cache management introduces further overhead. In this paper, we propose a two-layer hierarchical cluster-based caching solution to improve in-network caching efficiency. A network is grouped into several clusters; then a cluster head is nominated for each cluster to make caching decisions. The clustering approach offers scalability and permits multiple aspects of inputs to be used for decision making. Our solution jointly considers location and content popularity for caching. We implement our strategy in ndnSIM and test it on a GEANT-based network and the AS3967 network. Our simulation results show significant improvement over its peers.
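
    A cluster-head decision that jointly weighs content popularity and location, as the abstract describes, can be sketched roughly as follows. The scoring function, the weights, and the field names are hypothetical illustrations of the idea, not the paper's actual algorithm.

    ```python
    # Hypothetical cluster-head caching decision for an NDN-style network:
    # score each observed content item by popularity and by hop distance
    # from the content source, then keep the top items in the cluster cache.

    def cache_score(popularity, hop_distance, w_pop=0.7, w_loc=0.3):
        """Higher popularity and a more distant source both favor caching."""
        return w_pop * popularity + w_loc * hop_distance

    def clusterhead_decide(requests, cache_capacity):
        """Rank observed content by score; return the names to cache."""
        ranked = sorted(
            requests,
            key=lambda r: cache_score(r["popularity"], r["hops"]),
            reverse=True,
        )
        return [r["name"] for r in ranked[:cache_capacity]]

    requests = [
        {"name": "/video/a", "popularity": 0.9, "hops": 4},
        {"name": "/doc/b",   "popularity": 0.2, "hops": 9},
        {"name": "/video/c", "popularity": 0.8, "hops": 2},
    ]
    cached = clusterhead_decide(requests, cache_capacity=2)
    ```

    Centralizing this ranking at a cluster head, rather than letting every router cache independently, is what gives the clustering approach its scalability and avoids redundant copies within a cluster.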

  • 150.
    Pan, Linqiang
    et al.
    Key Laboratory of Image Information Processing and Intelligent Control of Education Ministry of China, School of Automation, Huazhong University of Science and Technology, Wuhan.
    Wu, Tingfang
    Key Laboratory of Image Information Processing and Intelligent Control of Education Ministry of China, School of Automation, Huazhong University of Science and Technology, Wuhan.
    Su, Yansen
    Key Lab of Intelligent Computing and Signal Processing of Ministry of Education, School of Computer Science and Technology, Anhui University, Hefei.
    Vasilakos, Athanasios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    Cell-like spiking neural P systems with request rules2017In: IEEE Transactions on Nanobioscience, ISSN 1536-1241, E-ISSN 1558-2639, Vol. 16, no 6, p. 513-522Article in journal (Refereed)
    Abstract [en]