GCHAR: An efficient Group-based Context-aware human activity recognition on smartphone
Nanjing University of Posts and Telecommunications.
Nanjing University of Posts and Telecommunications.
Nanjing University of Posts and Telecommunications.
Waseda University, Japan.
2017 (English). In: Journal of Parallel and Distributed Computing, ISSN 0743-7315, E-ISSN 1096-0848. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

With smartphones becoming ubiquitous and increasingly equipped with various sensors, there is a trend towards implementing HAR (human activity recognition) algorithms and applications on smartphones, including health monitoring, self-managing systems and fitness tracking. However, one of the main issues of existing HAR schemes is that classification accuracy is relatively low, and improving it requires high computation overhead. In this paper, GCHAR, an efficient Group-based Context-aware classification method for human activity recognition on smartphones, is proposed; it exploits a hierarchical group-based scheme to improve classification efficiency and reduces classification error through context awareness rather than intensive computation. Specifically, GCHAR designs a two-level hierarchical classification structure, i.e., inter-group and inner-group, and utilizes the previous state and transition logic (so-called context awareness) to detect transitions among activity groups. In comparison with other popular classifiers such as RandomTree, Bagging, J48, BayesNet, KNN and Decision Table, thorough experiments on a realistic dataset (the UCI HAR repository) demonstrate that GCHAR achieves the best classification accuracy, reaching 94.1636%; its time consumption in the training stage is four times shorter than that of the simple Decision Table, and its classification-stage time is decreased by 72.21% in comparison with BayesNet.
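To make the two-level, context-aware idea in the abstract concrete, the Python sketch below shows how an inter-group classifier, per-group inner classifiers, and a transition table over the previous state could be combined. It is an illustrative sketch only, not the authors' GCHAR implementation: the activity grouping, the transition rules, and the scikit-learn-style predict interface are assumptions made for illustration.

# Hypothetical sketch of a two-level (inter-group / inner-group),
# context-aware classification step. NOT the paper's actual code.

GROUPS = {
    # Hypothetical grouping of the six UCI HAR activities.
    "stationary": ["SITTING", "STANDING", "LAYING"],
    "walking":    ["WALKING"],
    "stairs":     ["WALKING_UPSTAIRS", "WALKING_DOWNSTAIRS"],
}

# Hypothetical transition logic (context awareness): which groups are
# reachable from the previous window's group. For example, a direct jump
# from "stationary" to "stairs" is not allowed here.
ALLOWED_NEXT = {
    "stationary": {"stationary", "walking"},
    "walking":    {"stationary", "walking", "stairs"},
    "stairs":     {"walking", "stairs"},
}


def classify_window(features, prev_group, inter_group_clf, inner_group_clfs):
    """Classify one sensor window with the two-level hierarchical scheme.

    features         -- feature vector extracted from one sliding window
    prev_group       -- group predicted for the previous window (or None)
    inter_group_clf  -- classifier predicting an activity *group*
    inner_group_clfs -- dict: group name -> classifier for activities in it
    """
    # Level 1 (inter-group): coarse prediction of the activity group.
    group = inter_group_clf.predict([features])[0]

    # Context awareness: if the predicted group is not reachable from the
    # previous group, keep the previous group instead of accepting an
    # implausible transition.
    if prev_group is not None and group not in ALLOWED_NEXT[prev_group]:
        group = prev_group

    # Level 2 (inner-group): fine-grained prediction within the chosen group.
    activity = inner_group_clfs[group].predict([features])[0]
    return group, activity

Under these assumptions, each inner classifier only has to separate the few activities inside one group, and implausible group switches are filtered by a cheap table lookup on the previous state rather than by additional computation, which is the kind of trade-off the abstract attributes to the hierarchical, context-aware design.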

Place, publisher, year, edition, pages
Elsevier, 2017.
National Category
Media and Communication Technology
Research subject
Mobile and Pervasive Computing
Identifiers
URN: urn:nbn:se:ltu:diva-63604
DOI: 10.1016/j.jpdc.2017.05.007
OAI: oai:DiVA.org:ltu-63604
DiVA: diva2:1103358
Available from: 2017-05-30 Created: 2017-05-30 Last updated: 2017-06-02

Open Access in DiVA

No full text

Other links

Publisher's full text

Search in DiVA

By author/editor
Vasilakos, Athanasios
By organisation
Computer Science
In the same journal
Journal of Parallel and Distributed Computing
Media and Communication Technology
