Day 1 RE’06 (update from Joy)



“That’s neat, but…so what?”

I noticed myself frequently thinking these words at RE’06. Like Tony, I found myself disappointed that academia seems removed from the practical world of requirements engineering. I believe it’s supposed to work like this – the academics do research in new areas, come up with brilliant new ideas, and then apply them to problems in industry. While there were a few exceptions to this, most of what I saw failed to be clearly applicable to real problems.

I will provide summaries of talks that Tony has not already commented on.

I saw the third paper presented in the “Languages, Methods and Tools” session, “Making Mobile Requirements Engineering Tools Usable and Useful”. The authors were trying to address what mobile computing devices mean to the requirements engineering field. They developed a prototype PDA application for eliciting requirements. Such a tool would allow requirements to be discovered in the work environment (even while walking around). The prototype had obvious limitations, though: a small screen, no keyboard, and general usability issues.

The interesting question for me was – in what situations would this be useful? How often do we really need to gather requirements where we cannot use a laptop (or even a tablet PC)? The presenter indicated there was a specific need for someone to do this, but he did not elaborate on what it was.

The afternoon of Day 1 was spent in the session of research papers on “Non-Functional Requirements”.

The first paper in this group was titled “The Detection and Classification of Non-Functional Requirements with Application to Early Aspects”. The authors’ goal was to develop a method to automatically detect and classify the non-functional requirements buried in standard project materials (notes, emails, documents). In simple terms, there are keywords associated with each non-functional requirement type. For each sentence in a document, the keywords are used to calculate the likelihood that the sentence belongs to a particular type, and if that likelihood is above a threshold, the sentence is classified as a requirement of that type.
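To make the idea concrete, here is a minimal sketch of that keyword-scoring approach. The keyword lists, scoring formula, and threshold are my own illustrative guesses, not the authors’ actual method:

```python
# Illustrative sketch of keyword-based NFR classification.
# Keyword sets, scoring, and threshold are invented for this example.

NFR_KEYWORDS = {
    "security": {"encrypt", "authenticate", "authorize", "password", "access"},
    "performance": {"fast", "latency", "throughput", "response", "seconds"},
    "usability": {"intuitive", "easy", "learn", "user-friendly", "accessible"},
}

THRESHOLD = 0.25  # hypothetical cutoff

def classify_sentence(sentence: str) -> list[str]:
    """Return the NFR types whose keyword score exceeds the threshold."""
    words = set(sentence.lower().split())
    types = []
    for nfr_type, keywords in NFR_KEYWORDS.items():
        # Score = fraction of this type's keywords found in the sentence.
        score = len(words & keywords) / len(keywords)
        if score >= THRESHOLD:
            types.append(nfr_type)
    return types

print(classify_sentence("The system shall encrypt the password before access is granted."))
# -> ['security']
```

Even in this toy version you can see both failure modes: a requirement phrased without the expected keywords is missed, and an innocent sentence that happens to contain them is flagged.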

It’s not clear to me that there is a benefit gained from such methods. If the method misses requirements (and I will argue it must), then you still have to parse through your documentation manually so that you do not miss anything. And if it flags sentences that are not really requirements (again, I don’t think this can be avoided), you also still have to read through everything it extracted to weed out the false positives and the misclassifications. Either way, I’m skeptical that such methods save much work.

I found the second paper in this session, “Emotional Requirements in Video Games”, to be quite interesting. The premise is that video games are designed using emotional requirements, with a target emotional state and means to induce that state in a player. Emotional requirements describe the story of the player’s experience, where and when the emotions should be felt, and how they vary over time. This type of requirement is very prevalent in video game software and can be challenging to capture, so this paper provides an approach to that problem. They propose using emotional terrain maps (where there is emotion), emotional intensity maps (what emotion) and timelines (how it varies) as visual representations of the requirements.
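As a thought experiment, here is a toy data structure for what one such emotional requirement might look like, combining the three representations. The field names and values are purely my own illustration, not the authors’ notation:

```python
# Toy representation of an emotional requirement, assuming my reading of
# terrain maps (where), intensity maps (what emotion), and timelines
# (how it varies). All names and values are illustrative only.

from dataclasses import dataclass

@dataclass
class EmotionalRequirement:
    emotion: str    # target emotional state (intensity map entry), e.g. "dread"
    location: str   # game area where it applies (terrain map entry)
    timeline: list[tuple[float, float]]  # (minutes into level, intensity 0-1)

haunted_corridor = EmotionalRequirement(
    emotion="dread",
    location="level 3, east corridor",
    timeline=[(0.0, 0.2), (2.0, 0.6), (3.5, 1.0), (4.0, 0.3)],  # build, peak, release
)
```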

There was a brief conversation at the end, with other researchers in the room proposing additional applications of the techniques. Suggestions included process workflows, such as optimizing workflow on a factory floor, and robot planning. There was also an interesting discussion about further work desired in the gaming industry on how to verify this type of requirement.

The third paper in this session was called “Towards Regulatory Compliance: Extracting Rights and Obligations to Align Requirements with Regulations”. Software systems often must enforce regulations. The authors developed a systematic method to parse legal documentation and extract and prioritize the rights and obligations it contains. They identify the constraints, resolve ambiguities, and trace each item back to the original policy to demonstrate compliance. In their research, they used HIPAA as a case study.
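To illustrate one plausible step of such a method, here is a rough sketch that flags candidate rights and obligations by searching for normative phrases like “shall” and “may”. The phrase lists and crude sentence splitting are my simplification, not the authors’ actual process:

```python
# Rough sketch: flag candidate rights and obligations in regulatory text
# via normative phrases. Marker lists are my own simplification.

import re

OBLIGATION_MARKERS = r"\b(shall|must|is required to)\b"
RIGHT_MARKERS = r"\b(may|is permitted to|has a right to)\b"

def extract_candidates(text: str) -> list[tuple[str, str]]:
    """Tag each sentence as a candidate 'obligation' or 'right'."""
    candidates = []
    for sentence in re.split(r"(?<=[.;])\s+", text):
        if re.search(OBLIGATION_MARKERS, sentence, re.IGNORECASE):
            candidates.append(("obligation", sentence.strip()))
        elif re.search(RIGHT_MARKERS, sentence, re.IGNORECASE):
            candidates.append(("right", sentence.strip()))
    return candidates

sample = ("A covered entity must obtain the individual's authorization. "
          "An individual may request a copy of their health record.")
print(extract_candidates(sample))
# -> [('obligation', "A covered entity must ..."),
#     ('right', "An individual may ...")]
```

The hard parts the authors tackle, resolving ambiguity and tracing each extracted item back to its source clause, sit on top of a pass like this.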

The final event of the day for us was the poster session. Most of the posters were hard to understand, and the students were not present to explain them when we stopped by. I was somewhat intrigued by one titled “So, You Think You are a Requirements Engineer?” The authors present a suggested map of skill areas against experience, where the skill areas are elicitation, analysis, communication, validation, and management. I’m mostly curious to see the findings of the empirical study of the community that they have started.
