CRAFT: A Crowd-Annotated Feedback Technique
Authors: Hosseini, M., Groen, E.C., Shahri, A. and Ali, R.
Journal: Proceedings - 2017 IEEE 25th International Requirements Engineering Conference Workshops, REW 2017
Pages: 170-175
DOI: 10.1109/REW.2017.27
Abstract: The ever-increasing accessibility of the web for the crowd, offered by various electronic devices such as smartphones, has facilitated the communication of the needs, ideas, and wishes of millions of stakeholders. To cater for the scale of this input and reduce the overhead of manual elicitation methods, data mining and text mining techniques have been utilised to automatically capture and categorise this stream of feedback, which is also used, amongst other things, by stakeholders to communicate their requirements to software developers. Such techniques, however, fall short of identifying some of the peculiarities and idiosyncrasies of the natural language that people use colloquially. This paper proposes CRAFT, a technique that utilises the power of the crowd to support richer, more powerful text mining by enabling the crowd to categorise and annotate feedback through a context menu. This, in turn, helps requirements engineers to better identify user requirements within such feedback. This paper presents the theoretical foundations as well as the initial evaluation of this crowd-based feedback annotation technique for requirements identification.
https://eprints.bournemouth.ac.uk/29514/
Source: Scopus
CRAFT: A Crowd-Annotated Feedback Technique
Authors: Hosseini, M., Groen, E.C., Shahri, A. and Ali, R.
Journal: 2017 IEEE 25th International Requirements Engineering Conference Workshops (REW)
Pages: 170-175
DOI: 10.1109/REW.2017.27
https://eprints.bournemouth.ac.uk/29514/
Source: Web of Science (Lite)
CRAFT: A Crowd-Annotated Feedback Technique
Authors: Hosseini, M., Groen, E., Shahri, A. and Ali, R.
Conference: CrowdRE: 2nd International Workshop on Crowd-Based Requirements Engineering
Date: 4 September 2017
https://eprints.bournemouth.ac.uk/29514/
Source: Manual
CRAFT: A Crowd-Annotated Feedback Technique
Authors: Hosseini, M., Groen, E., Shahri, A. and Ali, R.
Conference: CrowdRE: 2nd International Workshop on Crowd-Based Requirements Engineering
https://eprints.bournemouth.ac.uk/29514/
https://crowdre.github.io/ws-2017/program.html
Source: BURO EPrints