GCHAR: An efficient Group-based Context-aware human activity recognition on smartphone
Nanjing University of Posts and Telecommunications, China.
Nanjing University of Posts and Telecommunications, China.
Nanjing University of Posts and Telecommunications, China.
Waseda University, Japan.
2018 (English) In: Journal of Parallel and Distributed Computing, ISSN 0743-7315, E-ISSN 1096-0848, Vol. 118, part 1, p. 67-80. Article in journal (Refereed) Published
Abstract [en]

With smartphones becoming ubiquitous and being equipped with a variety of sensors, there is a growing trend towards implementing HAR (Human Activity Recognition) algorithms and applications on smartphones, including health monitoring, self-managing systems, and fitness tracking. However, one of the main issues with existing HAR schemes is that their classification accuracy is relatively low, and improving it requires high computational overhead. In this paper, GCHAR, an efficient Group-based Context-aware classification method for human activity recognition on smartphones, is proposed. It exploits a hierarchical group-based scheme to improve classification efficiency, and reduces classification error through context awareness rather than intensive computation. Specifically, GCHAR designs a two-level hierarchical classification structure, i.e., inter-group and inner-group, and utilizes the previous state and transition logic (so-called context awareness) to detect transitions among activity groups. In comparison with other popular classifiers such as RandomTree, Bagging, J48, BayesNet, KNN, and Decision Table, thorough experiments on a realistic dataset (the UCI HAR repository) demonstrate that GCHAR achieves the best classification accuracy, reaching 94.1636%; its training time is four times shorter than that of the simple Decision Table, and its classification time is 72.21% lower than that of BayesNet.
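The two-level, context-aware scheme the abstract describes can be sketched as follows. The activity groups, the transition table, and the threshold-based stand-in classifiers are illustrative assumptions for demonstration, not the paper's actual configuration:

```python
# Sketch of a group-based, context-aware HAR classifier in the spirit of
# GCHAR. Groups, transitions and classifiers below are hypothetical.

# Inter-group level: activities partitioned into groups.
GROUPS = {
    "static": ["sitting", "standing", "lying"],
    "dynamic": ["walking", "walking_upstairs", "walking_downstairs"],
}

# Context awareness: candidate groups reachable from the previous activity.
# (Toy transition logic, e.g. one must leave "lying" via a static posture.)
TRANSITIONS = {
    None: ["static", "dynamic"],        # no history: consider all groups
    "sitting": ["static", "dynamic"],
    "standing": ["static", "dynamic"],
    "lying": ["static"],                # toy rule: stand up before moving
    "walking": ["static", "dynamic"],
    "walking_upstairs": ["dynamic", "static"],
    "walking_downstairs": ["dynamic", "static"],
}

def classify(sample, prev_activity, inter_clf, inner_clfs):
    """Two-level classification restricted by the transition table.

    inter_clf(sample, candidates) -> group name
    inner_clfs[group](sample)     -> activity name
    """
    candidates = TRANSITIONS[prev_activity]
    if len(candidates) == 1:
        group = candidates[0]           # context alone decides the group
    else:
        group = inter_clf(sample, candidates)
    return group, inner_clfs[group](sample)

# Toy stand-ins for trained classifiers: threshold on acceleration variance.
inter = lambda s, cands: ("dynamic"
                          if s["acc_var"] > 0.5 and "dynamic" in cands
                          else "static")
inner = {
    "static": lambda s: "sitting",
    "dynamic": lambda s: "walking",
}

print(classify({"acc_var": 0.9}, None, inter, inner))
print(classify({"acc_var": 0.9}, "lying", inter, inner))
```

The second call shows the context-aware part: even though the sensor evidence looks dynamic, the transition table rules the dynamic group out after "lying", so no inter-group classification is needed and a likely misclassification is avoided.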

Place, publisher, year, edition, pages
Elsevier, 2018. Vol. 118, part 1, p. 67-80
National subject category
Media Engineering
Research subject
Distributed Computer Systems
Identifiers
URN: urn:nbn:se:ltu:diva-63604
DOI: 10.1016/j.jpdc.2017.05.007
ISI: 000434003300008
Scopus ID: 2-s2.0-85021154566
OAI: oai:DiVA.org:ltu-63604
DiVA, id: diva2:1103358
Note

Validated; 2018; Level 2; 2018-05-15 (rokbeg)

Available from: 2017-05-30 Created: 2017-05-30 Last updated: 2018-07-24 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records BETA

Vasilakos, Athanasios

Search in DiVA

By the author/editor
Vasilakos, Athanasios
By the organisation
Computer Science
In the same journal
Journal of Parallel and Distributed Computing
Media Engineering
