a guest, Jun 15th, 2016
Exam: Human Computer Interaction - part 2

Lecture 7: Data gathering

Buzzword: WiSee.

Whole-home gesture recognition using wireless signals. Unlike Kinect etc., no infrastructure and no cameras are needed, and it works in through-the-wall and out-of-sight scenarios.

7.2 – Five key issues

1. Setting goals – decide how to analyze data once collected
2. Identifying participants – decide who to gather data from
3. Relationship with participants – clear and professional; informed consent when appropriate
4. Triangulation – look at data from more than one perspective
5. Pilot studies – small trial of the main study

7.3 – Data recording

Notes, audio, video, photographs. Notes plus photographs. Audio plus photographs. Video. Interaction logging, sensor-based recording (biometrics, speech/gestures).

Data recording in COMIC/SLOT, Spatial Logistics Task. Multimodal interaction. How do users perform turn-taking and negotiate information while using facial expressions, speech and gestures?
7.4 – Interviews

Unstructured – not directed by a script. Rich but not replicable.
Structured – tightly scripted, often like a questionnaire. Replicable but may lack richness.
Semi-structured – guided by a script, but interesting issues can be explored in more depth. Can provide a good balance between richness and replicability.

Interview questions.

Two types: closed questions and open questions. Closed questions are easier to analyze. Avoid long questions, compound sentences, jargon and language that the interviewee might not understand, leading questions that make assumptions, and unconscious biases.

Running the interview.

- Introduction – introduce yourself, explain the goals of the interview, reassure about the ethical issues, ask to record, present any informed consent form.
- Warm-up – make the first questions easy and non-threatening.
- Main body – present questions in a logical order.
- Cool-off period – include a few easy questions to defuse tension at the end.
- Closure – thank the interviewee and signal the end, e.g. switch the recorder off.
7.5 - Questionnaires

Questions can be closed or open. Closed questions are easier to analyze and may be analyzed by computer. Questionnaires can be administered to large populations; paper, email and the web are used for dissemination. Sampling can be a problem when the size of the population is unknown, as is common online.

Questionnaire design.

The impact of a question can be influenced by question order. Do you need different versions of the questionnaire for different populations? Provide clear instructions on how to complete the questionnaire. Strike a balance between using white space and keeping the questionnaire compact. Decide whether phrases will be all positive, all negative or mixed.

Question and response format: ‘Yes’ and ‘No’ checkboxes. Checkboxes that offer many options. Rating scales. Open-ended responses.
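The point that closed questions "may be done by computer" can be illustrated with a minimal sketch: tallying responses on a rating scale is a mechanical operation. The response values below are invented for a hypothetical 5-point item.

```python
from collections import Counter

# Hypothetical closed-question responses on a 5-point rating scale
# (1 = strongly disagree ... 5 = strongly agree). Values are invented.
responses = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]

distribution = Counter(responses)          # frequency per scale point
mean_score = sum(responses) / len(responses)

print(dict(sorted(distribution.items())))  # {2: 1, 3: 2, 4: 4, 5: 3}
print(mean_score)                          # 3.9
```

Open-ended responses, by contrast, need manual coding before any such counting is possible, which is why they are harder to analyze.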
Encouraging a good response.

Make sure the purpose of the study is clear. Promise anonymity. Ensure the questionnaire is well designed. Offer a short version for those who do not have time to complete a long questionnaire. If mailed, include a stamped addressed envelope. Follow up with emails, phone calls or letters. Provide an incentive. A 40% response rate is high; 20% is often acceptable.

Advantages of online questionnaires.

Responses are usually received quickly. No copying and postage costs. Data can be collected in a database for analysis. Time required for data analysis is reduced. Errors can be corrected easily.

Problems with online questionnaires.

Sampling is problematic if the population size is unknown. Preventing individuals from responding more than once is difficult. Individuals have also been known to change questions in email questionnaires.
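One common remedy for the duplicate-response problem (an assumption on my part; the notes only name the problem) is to issue each invitee a one-time token that is spent on first submission. A minimal sketch:

```python
import secrets

class SurveyTokens:
    """Issue one-time tokens so each invitee can submit at most once.
    Illustrative sketch only; class and method names are invented."""
    def __init__(self):
        self.unused = set()

    def issue(self):
        token = secrets.token_hex(8)   # unguessable per-invitee token
        self.unused.add(token)
        return token

    def submit(self, token):
        """Accept a response only if the token is valid and unspent."""
        if token in self.unused:
            self.unused.remove(token)
            return True
        return False

tokens = SurveyTokens()
t = tokens.issue()
print(tokens.submit(t))   # first submission accepted: True
print(tokens.submit(t))   # duplicate rejected: False
```

This also illustrates the sampling trade-off above: tokens require knowing the invitee list, which open online surveys by definition lack.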
An online experiment with Maki

A typical AI/HF experiment. AI: synthesized music. HF: how do humans cope with, perceive and appreciate it?

Data gathered:

- personal info: age/gender/language/experience
- how human-like does it sound?
- how do you like it?
- similarity between two sounds

Experimental conditions:

- Two main conditions (human/synthesized)
- Several sub-conditions (rhythm/contour/harmony)
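The design crosses the two main conditions with the three sub-conditions. Assuming a full factorial layout (the notes do not say whether every cell was tested), the experimental cells can be enumerated directly:

```python
from itertools import product

main_conditions = ["human", "synthesized"]
sub_conditions = ["rhythm", "contour", "harmony"]

# Full factorial design: every (main, sub) cell of the experiment.
cells = list(product(main_conditions, sub_conditions))
print(len(cells))   # 6 experimental cells
print(cells[0])     # ('human', 'rhythm')
```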
7.6 - Observation

Direct observation in the field
– Structuring frameworks
– Degree of participation (insider or outsider)
– Ethnography

Direct observation in controlled environments.

Indirect observation: tracking users’ activities
– Diaries
– Interaction logging
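Interaction logging, the indirect-observation technique listed above, can be sketched as a small recorder of timestamped UI events. Event names and fields are illustrative only.

```python
import time

class InteractionLog:
    """Minimal interaction logger: records timestamped UI events
    for later analysis. A sketch; names are invented."""
    def __init__(self):
        self.events = []

    def record(self, event, **details):
        self.events.append({"t": time.time(), "event": event, **details})

    def count(self, event):
        return sum(1 for e in self.events if e["event"] == event)

log = InteractionLog()
log.record("click", target="save_button")
log.record("keypress", key="Enter")
log.record("click", target="menu")
print(log.count("click"))  # 2
```

Unlike direct observation, such logs accumulate without an observer present, which is exactly what makes the technique "indirect".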
7.7 - Choosing and combining techniques

Depends on
– The focus of the study
– The participants involved
– The nature of the technique
– The resources available

Summary

Three main data gathering methods: interviews, questionnaires, observation. Five key issues of data gathering: setting goals, identifying participants, relationship with participants, triangulation, pilot studies. Interviews may be structured, semi-structured or unstructured. Questionnaires may be on paper, online or by telephone. Observation may be direct or indirect, in the field or in a controlled setting. Techniques can be combined depending on study focus, participants, nature of the technique and available resources.
Lecture 9: The process of ID

Buzzword: The Tech Box.

What is involved in ID?

• It is a process:
– a goal-directed problem-solving activity informed by intended use, target domain, materials, cost, and feasibility
– a creative activity
– a decision-making activity to balance trade-offs

• Four approaches:
– user-centred design
– activity-centred design
– systems design
– genius design

Importance of involving users

• Expectation management
– Realistic expectations
– No surprises, no disappointments
– Timely training
– Communication, but no hype

• Ownership
– Make the users active stakeholders
– More likely to forgive or accept problems
– Can make a big difference to acceptance and success of the product
Degrees of user involvement

• Member of the design team
– Full time: constant input, but lose touch with users
– Part time: patchy input, and very stressful
– Short term: inconsistent across project life
– Long term: consistent, but lose touch with users

• Newsletters and other dissemination devices
– Reach a wider selection of users
– Need communication both ways

• User involvement after the product is released

• Combination of these approaches

User-centered approach is based on:
– Early focus on users and tasks: directly studying cognitive, behavioral, anthropomorphic and attitudinal characteristics
– Empirical measurement: users’ reactions to and performance with scenarios, manuals, simulations and prototypes are observed, recorded and analysed
– Iterative design: when problems are found in user testing, fix them and carry out more tests

Four basic activities in ID:

1. Establishing requirements 2. Designing alternatives 3. Prototyping 4. Evaluating
Some practical issues: Who are the users? What do we mean by ‘needs’? How to generate alternatives? How to choose among alternatives? How to integrate interaction design activities with other models?

Who are the users/stakeholders?

• Not as obvious as you think: those who interact directly with the product, those who manage direct users, those who receive output from the product, those who make the purchasing decision, those who use competitors’ products.

• Three categories of user (Eason, 1987):
– primary: frequent hands-on
– secondary: occasional, or via someone else
– tertiary: affected by its introduction, or will influence its purchase

What do we mean by ‘needs’?

• Users rarely know what is possible
• Users can’t tell you what they ‘need’ to help them achieve their goals
• Instead, look at existing tasks: their context. What information do they require? Who collaborates to achieve the task? Why is the task achieved the way it is?
• Envisioned tasks: can be rooted in existing behaviour; can be described as future scenarios.

How to generate alternatives

Humans stick to what they know works, but considering alternatives is important to ‘break out of the box’. Designers are trained to consider alternatives; software people generally are not. How do you generate alternatives? 1) ‘Flair and creativity’: research and synthesis. 2) Seek inspiration: look at similar products or at very different products.
How to choose among alternatives

• Evaluation with users or with peers, e.g. of prototypes
• Technical feasibility: some alternatives are not possible
• Quality thresholds: usability goals lead to usability criteria, set early on and checked regularly
– safety: how safe?
– utility: which functions are superfluous?
– effectiveness: appropriate support? task coverage, information available
– efficiency: performance measurements

Genius design: inspiration-based; intuition, experience, expertise, teamwork.

UCD suits you when:
• You like predictable, measurable results
• User testing is a significant source of decision-making confidence for you and your team
• You are risk-averse
• You have the time and budget to devote to repeated testing and validation

Genius design suits you when:
• You are working with a highly experienced team
• You trust your colleagues’ intuition
• You trust your own intuition
• You have a deep understanding of your end users’ ultimate goals
Summary

Four basic activities in the design process:
1. Establishing requirements
2. Designing alternatives
3. Prototyping
4. Evaluating

User-centered design rests on three principles:
1. Early focus on users and tasks
2. Empirical measurement using quantifiable and measurable usability criteria
3. Iterative design
Lecture 10: Establishing requirements

Buzzword: chord typing. ASETNIOP.

Four basic activities in ID: 1) establishing requirements, 2) designing alternatives, 3) prototyping, 4) evaluating.

What: two aims:
1. Understand as much as possible about users, task, context
2. Produce a stable set of requirements

How: data gathering activities, data analysis activities, expression as ‘requirements’. All of this is iterative.

Boehm & Basili (2001): finding and fixing a software problem after delivery is often at least 100 times more expensive than finding and fixing it during the requirements and design phase.

Why: requirements definition is the stage where failure occurs most commonly. Getting requirements right is crucial.

What is a requirement? A statement about an intended product that specifies what it should do, or how it should perform, or how it should look and feel.
What do users want? What do users ‘need’?

Requirements need clarification, refinement, completion and re-scoping. Input: a requirements document (maybe). Output: stable requirements.

Why ‘establish’? Requirements arise from understanding users’ needs. Requirements can be justified and related to data.

Different kinds of requirements

• Functional: what the system should do. Historically the main focus of requirements activities.
• Non-functional: memory size, response time...
• Data: what kinds of data need to be stored? How will they be stored (e.g. database)?
• Users: who are they?
– Characteristics: ability, background, attitude to computers
– System use: novice, expert, casual, frequent
– Novice: step-by-step (prompted), constrained, clear information
– Expert: flexibility, access/power
– Frequent: shortcuts
– Casual/infrequent: clear instructions, e.g. menu paths
Environment or context of use:
– physical: dusty? noisy? vibration? light? heat? humidity? ... (e.g. OMS insects, ATM)
– social: sharing of files and displays, on paper, across great distances; working individually; privacy for clients
– organisational: hierarchy, IT department’s attitude, user support, communications structure and infrastructure, availability of training
– technical: standards, hardware, limitations

Personas

• Capture user characteristics
• Not real people, but synthesised from real user characteristics
• Should not be idealised
• Bring them to life with a name, characteristics, goals, personal background
• Develop multiple personas
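A persona is essentially a structured record with exactly the fields listed above (name, characteristics, goals, background). A minimal sketch; the two example personas and all their values are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A synthesised (not real) user, built from real user data."""
    name: str
    characteristics: list = field(default_factory=list)
    goals: list = field(default_factory=list)
    background: str = ""

# Example personas below are invented for illustration only.
personas = [
    Persona("Ana", ["novice", "uses phone daily"],
            ["check train times quickly"], "commuter, non-technical"),
    Persona("Bram", ["expert", "keyboard-centric"],
            ["batch-edit records"], "back-office administrator"),
]
print(len(personas))  # multiple personas, as recommended
```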
Data gathering for requirements

Interviews:
– Prototypes can be used in interviews
– Good for exploring issues
– But time consuming, and it may be infeasible to visit everyone

Focus groups (group interviews):
– Good at gaining a consensus view and/or highlighting areas of conflict
– But can be dominated by individuals

Questionnaires:
– Often used in conjunction with other techniques
– Can give quantitative or qualitative data
– Good for answering specific questions from a large, dispersed group of people

Researching similar products:
– Good for prompting requirements

Direct observation:
– Gain insights into stakeholders’ tasks
– Good for understanding the nature and context of the tasks
– But it requires time and commitment from a member of the design team, and it can result in a huge amount of data

Indirect observation:
– Not often used in the requirements activity
– Good for logging current tasks

Contextual Inquiry

• An approach to ethnographic study where the user is the expert and the designer is the apprentice
• A form of interview, but at the user’s workplace (workstation), typically 2 to 3 hours long
• Four main principles:
– Context: see the workplace and what happens there
– Partnership: user and developer collaborate
– Interpretation: observations are interpreted by user and developer together
– Focus: a project focus to understand what to look for
Problems with data gathering

• Identifying and involving stakeholders: users, managers, developers, customer reps? union reps? shareholders?
• Involving stakeholders: workshops, interviews, workplace studies; co-opt stakeholders onto the development team
• ‘Real’ users, not managers: traditionally a problem in software engineering, but better now
• Requirements management: version control, ownership
• Communication between parties: within the development team, with the customer/user, between users... different parts of an organisation use different terminology
• Domain knowledge distributed and implicit: difficult to dig up and understand; knowledge articulation: how do you walk?
• Availability of key people
• Political problems within the organisation
• Dominance of certain stakeholders
• Economic and business environment changes
• Balancing functional and usability demands
Some basic guidelines

• Focus on identifying the stakeholders’ needs
• Involve all the stakeholder groups
• Involve more than one representative from each stakeholder group
• Use a combination of data gathering techniques
• Support the process with props such as prototypes and task descriptions
• Run a pilot session
• You will need to compromise on the data you collect and the analysis to be done, but before you can make sensible compromises, you need to know what you’d really like
• Consider carefully how to record the data
Data interpretation and analysis

• Start soon after the data gathering session
• Initial interpretation before deeper analysis
• Different approaches emphasize different elements, e.g. class diagrams for object-oriented systems, entity-relationship diagrams for data-intensive systems

Task descriptions

• Scenarios: an informal narrative story; simple, ‘natural’, personal, not generalisable
• Use cases: assume interaction with a system, and a detailed understanding of the interaction
• Essential use cases: abstract away from the details; do not make the same assumptions as use cases; more structured, distinguishing between system and user
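The distinction an essential use case draws between user intention and system responsibility can be sketched as paired abstract steps. The ATM-style "retrieve cash" example below is a standard illustration, not taken from the notes:

```python
# An essential use case as (user intention, system responsibility) pairs,
# abstracting away interface details such as cards, PINs or screens.
essential_use_case = {
    "name": "retrieve cash",
    "steps": [
        ("identify self", "verify identity"),
        ("choose amount", "dispense cash"),
        ("take cash", "record transaction"),
    ],
}

for intention, responsibility in essential_use_case["steps"]:
    print(f"user: {intention:15s} | system: {responsibility}")
```

Note that nothing in the steps commits to a particular interface: "identify self" could be a card and PIN, a fingerprint, or a phone app, which is exactly the abstraction ordinary use cases lack.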
Task analysis

• Task descriptions are often used to envision new systems or devices
• Task analysis is used mainly to investigate an existing situation
• It is important not to focus on superficial activities. What are people trying to achieve? Why are they trying to achieve it? How are they going about it?
• Many techniques; the most popular is Hierarchical Task Analysis (HTA)

Hierarchical Task Analysis

• Involves breaking a task down into subtasks, then sub-sub-tasks, and so on. These are grouped as plans, which specify how the tasks might be performed in practice
• HTA focuses on physical and observable actions, and includes looking at actions not related to software or an interaction device
• Start with a user goal, which is examined, and the main tasks for achieving it are identified
• Tasks are sub-divided into sub-tasks
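The goal/subtask/plan structure described above maps naturally onto a nested data structure. A sketch with an invented library example; the numbering convention (0, 1, 2, 2.1, ...) follows common HTA practice:

```python
# Hierarchical Task Analysis sketch: a goal decomposed into subtasks,
# with plans stating how the subtasks are ordered. Example task invented.
hta = {
    "0": "Borrow a book from the library",
    "plan 0": "do 1, 2, 3 in order",
    "subtasks": [
        {"1": "Go to the library"},
        {"2": "Find the required book",
         "plan 2": "do 2.1; if not on shelf, do 2.2",
         "subtasks": [{"2.1": "Search the shelves"},
                      {"2.2": "Ask a librarian"}]},
        {"3": "Check out the book"},
    ],
}

def leaf_tasks(node):
    """Collect the lowest-level (physical, observable) actions."""
    children = node.get("subtasks")
    if not children:
        return [v for k, v in node.items() if not k.startswith("plan")]
    leaves = []
    for child in children:
        leaves.extend(leaf_tasks(child))
    return leaves

print(leaf_tasks(hta))
```

The leaves are the observable actions HTA cares about; the plans carry the ordering and conditions that a flat task list would lose.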
Summary

• Getting requirements right is crucial
• There are different kinds of requirement; each is significant for interaction design
• The most commonly used techniques for data gathering are questionnaires, interviews, focus groups, direct observation, studying documentation and researching similar products
• Scenarios, use cases and essential use cases can be used to articulate existing and envisioned work practices
• Task analysis techniques such as HTA help to investigate existing systems and practices
Lecture 11: Prototyping and building

Buzzword: HRI.

Human-Robot Interaction (HRI) is a relatively young discipline that has attracted a lot of attention over the past few years due to the increasing availability of complex robots and people's exposure to such robots in their daily lives, e.g. as robotic toys or, to some extent, as household appliances (robotic vacuum cleaners or lawn mowers). Also, robots are increasingly being developed for real-world application areas, such as rehabilitation, eldercare, robot-assisted therapy, and other assistive or educational applications.

HRI as a research domain is a synthetic science, and it should tackle the whole range of challenges, from the technical and cognitive/AI to the psychological, social and behavioural.

What is a prototype? In other design fields a prototype is a small-scale model: for example, a miniature car.

In interaction design it can be (among other things):
• a series of screen sketches
• a storyboard, i.e. a cartoon-like series of scenes
• a PowerPoint slide show
• a video simulating the use of a system
• a lump of wood (e.g. the PalmPilot)
• a cardboard mock-up
• a piece of software with limited functionality, written in the target language or another language
Why prototype?

• Evaluation and feedback are central to interaction design
• Stakeholders can see, hold and interact with a prototype more easily than with a document or a drawing
• Team members can communicate effectively
• You can test out ideas for yourself
• It encourages reflection: a very important aspect of design
• Prototypes answer questions and support designers in choosing between alternatives

What to prototype? Technical issues. Work flow, task design. Screen layouts and information display. Difficult, controversial, critical areas.

Low-fidelity prototyping uses a medium unlike the final medium, e.g. paper or cardboard. It is quick, cheap and easily changed.

‘Wizard-of-Oz’ prototyping

• The user thinks they are interacting with a computer, but a developer, rather than the system, is producing the responses.
• Usually done early in design to understand users’ expectations
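The Wizard-of-Oz setup can be sketched as a front end whose replies come from a pluggable "operator" rather than any real logic. In a real study the operator is a hidden person at another keyboard; here a stand-in function plays the wizard, and all names are invented:

```python
class WizardOfOzPrototype:
    """The 'system' the user sees; answers actually come from a hidden
    human operator, not from real software. Illustrative sketch."""
    def __init__(self, operator):
        self.operator = operator        # callable: the hidden human
        self.transcript = []            # record for later analysis

    def ask(self, user_input):
        reply = self.operator(user_input)   # wizard supplies the answer
        self.transcript.append((user_input, reply))
        return reply

def wizard(utterance):
    # Stand-in for the human operator typing a response.
    return "Certainly. Opening your calendar now."

ui = WizardOfOzPrototype(wizard)
print(ui.ask("Show me my calendar"))
```

Because the operator is swappable, the same front end can later be wired to real software, which is what makes the technique useful early in design.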
High-fidelity prototyping

• Uses materials that you would expect to be in the final product.
• The prototype looks more like the final system than a low-fidelity version does.
• For a high-fidelity software prototype, common environments include Macromedia Director, Visual Basic, and Smalltalk.
• Danger that users think they have a full system... see compromises below.

Compromises in prototyping

• All prototypes involve compromises
• For software-based prototyping: maybe there is a slow response? Sketchy icons? Limited functionality?
• Two common types of compromise: ‘horizontal’ (provide a wide range of functions, but with little detail) and ‘vertical’ (provide a lot of detail for only a few functions)
• Compromises in prototypes mustn’t be ignored. The product needs engineering.
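The horizontal/vertical distinction can be made concrete with two stubs; the library-catalogue example and all names are invented:

```python
# Horizontal prototype: every feature is reachable, but only as a stub
# with no real behaviour behind it (breadth, no depth).
def horizontal_prototype(feature):
    menu = ["search", "borrow", "return", "reserve"]
    if feature in menu:
        return f"[stub] '{feature}' screen shown, no real behaviour"
    return "unknown feature"

# Vertical prototype: one feature implemented in depth, the rest absent
# (depth, no breadth).
def vertical_prototype_search(catalogue, query):
    return [title for title in catalogue if query.lower() in title.lower()]

print(horizontal_prototype("borrow"))
print(vertical_prototype_search(["Interaction Design", "HCI Basics"], "design"))
```

A horizontal prototype is useful for evaluating navigation and workflow; a vertical one for evaluating whether a single critical function actually works for users.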
Construction

• Taking the prototypes (or what was learned from them) and creating a whole
• Quality must be attended to: usability (of course), reliability, robustness, maintainability, integrity, portability, efficiency, etc.
• The product must be engineered: evolutionary prototyping vs. ‘throw-away’ prototyping

Conceptual design: from requirements to design

• Transform user requirements/needs into a conceptual model
• “a description of the proposed system in terms of a set of integrated ideas and concepts about what it should do, behave and look like, that will be understandable by the users in the manner intended”
• Don’t move to a solution too quickly. Iterate, iterate, iterate
• Consider alternatives: prototyping helps
Is there a suitable metaphor?

• Interface metaphors combine familiar knowledge with new knowledge in a way that helps the user understand the product.
• Three steps: understand functionality, identify potential problem areas, generate metaphors
• Evaluate metaphors: How much structure does it provide? How much is relevant to the problem? Is it easy to represent? Will the audience understand it? How extensible is it?

Considering interaction types

• Which interaction type? How the user invokes actions: instructing, conversing, manipulating or exploring
• Do different interface types provide insight? WIMP, shareable, augmented reality, etc.

Expanding the conceptual model

• What functions will the product perform? What will the product do and what will the human do (task allocation)?
• How are the functions related to each other? Sequential or parallel? Categorisations, e.g. all actions related to telephone memory storage
• What information needs to be available? What data is required to perform the task? How is this data to be transformed by the system?
Summary

• Different kinds of prototyping are used for different purposes and at different stages
• Prototypes answer questions, so prototype appropriately
• Construction: the final product must be engineered appropriately
• Conceptual design is the first step of design
• Consider interaction types and interface types to prompt creativity
• Storyboards can be generated from scenarios
• Card-based prototypes can be generated from use cases
Lecture 12: ID in Practice

Buzzword: BCI. Wolpaw (2002): a BCI translates brain signals into control commands for steering a device or neural interface.

Agile development

• Short (one to three week) timeboxes of iterative development (sprint, iteration, cycle)
• Early and repeated customer/user feedback
• Re-prioritisation of work based on customer/user feedback, so that emergent requirements can be handled

AgileUX

• Integrates techniques from interaction design and Agile software development
• AgileUX requires a change of mindset
• In Agile, as implementation proceeds, requirements are elaborated and re-prioritised
• All techniques in UX are still relevant, but when and how much needs re-thinking: focus on the product, not the design, as the deliverable; cross-functional teams

User research

• Aims to characterise users through data collection and analysis
• Agile’s timeboxing approach does not support long periods of user research
• User evaluations and some detailed work can be fitted within a timebox
• Some user research can be performed in iteration 0 (zero), before implementation starts
• Maintain an ongoing programme of user research

Aligning work practices

• Designing a complete product upfront causes problems because of re-prioritisation
• Some upfront work is needed (technical and UX)
• Use a parallel-tracks approach: create the product vision before development starts (cycles 0 and 1); do design work one iteration ahead of development; some teams work two iterations ahead