Week 4

Presentation Transcript


  1. Week 4 IA Review, TRS, Project Site, FTE

  2. My definition of “legacy data”

  3. Establishing Breadth: Card Sorting • Exploratory card sorting can be helpful • Provide users with the content pieces and have them sort the content into related groupings, then label the groupings • This is useful primarily for establishing breadth and site structure (hierarchy) • User-supplied labels can sometimes be good at conveying scent

  4. Exploratory Card Sort Process 1. Orient the user (What is the site? Task?) 2. The user groups related cards into piles 3. The user assigns one label to each pile 4. Can the piles be subdivided further? 5. Label each of the smaller sub-piles 6. Sometimes further subdivision is needed 7. Record the groupings and labels 8. Repeat with another user
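
To make step 7 concrete, here is a minimal sketch (not from the lecture) of one way to record a user's groupings and labels so they can be compared across users; the class and field names are illustrative assumptions.

# Minimal sketch of a record for one exploratory card sort session.
# Class and field names are illustrative, not from any specific tool.
from dataclasses import dataclass, field

@dataclass
class Pile:
    label: str                                   # label the user assigned to the pile
    cards: list[str]                             # content cards placed in the pile
    subpiles: list["Pile"] = field(default_factory=list)  # optional further subdivision

@dataclass
class SortSession:
    user_id: str
    piles: list[Pile]

# Example: part of one user's grouping from the election-site exercise
session = SortSession(
    user_id="P01",
    piles=[
        Pile("News & Events", ["Press releases", "Speeches", "Media coverage"]),
        Pile("Getting Involved", ["Volunteering", "Campaign donations"]),
    ],
)
print(session.piles[0].label, session.piles[0].cards)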

  5. Card Sorting Example: Election Website. Content cards: Candidate’s bio, Election issues, Press releases, Campaign events, Speeches, Campaign donations, Media coverage, Campaign timeline, Voter registration, Website feedback, Newsletter, Endorsements, Ask the candidate a question, On-the-road journal, Candidate’s record and accomplishments, Let a friend know about this website, Candidate comparison, Privacy policy, Related links, Sitemap, Frequently asked questions, Volunteering, Campaign staff and openings

  6. Create Primary Groups: Candidate’s bio, Privacy policy, Candidate’s record and accomplishments, Website feedback, Campaign staff and openings, Ask the candidate a question, Sitemap, Related links, Press releases, Election issues, On-the-road journal, Candidate comparison, Campaign timeline, Frequently asked questions, Volunteering, Media coverage, Endorsements, Speeches, Voter registration, Newsletter, Campaign donations, Campaign events, Let a friend know about this website

  7. Label Primary Groups: Privacy policy, About the Candidate, Website feedback, Candidate’s bio, Sitemap, Candidate’s record and accomplishments, Related links, Campaign staff and openings, Ask the candidate a question, News & Events, Press releases, On The Issues, On-the-road journal, Getting Involved, Election issues, Campaign timeline, Volunteering, Candidate comparison, Media coverage, Endorsements, Frequently asked questions, Speeches, Voter registration, Newsletter, Campaign donations, Campaign events, Let a friend know about this website

  8. Create Secondary Groups. News & Events pile: Campaign timeline, Media coverage, Campaign events, Press releases, Newsletter, On-the-road journal, Speeches

  9. Label Secondary Groups. News & Events: Events (Campaign timeline, Campaign events); In the Media (Media coverage, Press releases, Newsletter); News from the Candidate (On-the-road journal, Speeches)

  10. Analyzing the Data • ‘Eyeball’ the data for common groupings and the number of top-level categories • Use a program for analysis (as well as administration of the card sort): • EZSort/USort • WebCAT

  11. Breadth and Similarity Matching • User rates on a scale of 1-10 the similarity of every possible pairing of content cards • Cluster analysis creates the groups by crunching the numbers and seeing which items are rated as being most similar • No labels are suggested for each cluster of content items, but hopefully a clear label emerges from examining the groupings
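
The slide does not name a particular algorithm, but hierarchical (agglomerative) clustering is a common choice for this kind of similarity data. The sketch below uses SciPy with a small, made-up rating matrix purely for illustration.

# One way to run the cluster analysis described above, using SciPy's
# hierarchical clustering. The ratings below are invented for the example.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Press releases", "Speeches", "Media coverage", "Voter registration"]

# Average similarity ratings (1-10) for every pair of cards, symmetric matrix.
similarity = np.array([
    [10, 8, 7, 2],
    [ 8, 10, 6, 1],
    [ 7, 6, 10, 2],
    [ 2, 1, 2, 10],
], dtype=float)

# Convert similarity to distance, then cluster.
distance = 10 - similarity
np.fill_diagonal(distance, 0)                        # self-distance is zero
condensed = squareform(distance)                     # condensed pairwise distance vector
tree = linkage(condensed, method="average")          # average-linkage clustering
groups = fcluster(tree, t=2, criterion="maxclust")   # cut the tree into 2 clusters

for card, group in zip(cards, groups):
    print(group, card)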

  12. Assessing Scent Quality • Two techniques help in assessing the quality of information scent: • Confirmatory card sorting • User testing

  13. Confirmatory Card Sorting • Conducted after the site architecture has been developed • Asks the question: Do users expect to find content under the ‘right’ label? • If users sort content under the ‘wrong’ label (or cannot place the content at all), that strongly suggests scent issues with the current labeling

  14. Confirmatory Card Sort Process 1. Orient the user (What is the site? Task?) 2. Lay out cards with global navigation labels 3. User puts content cards under the appropriate global navigation label 4. Lay out cards with second-level labels 5. User subdivides content cards under new second-level labels 6. Lay out third-level cards and sort further
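
As a rough illustration of how the answers might be scored afterwards, here is a small sketch (an assumption, not part of the course materials) that computes, for each card, the share of users who placed it under the label the architecture expects.

# Summarize a confirmatory card sort: how often did users put each card
# under the label the designers expected? Low agreement suggests a scent problem.
from collections import Counter

def placement_agreement(expected, placements):
    """expected: {card: label}; placements: list of {card: label}, one dict per user."""
    report = {}
    for card, target in expected.items():
        chosen = [p[card] for p in placements if card in p]
        hits = sum(1 for label in chosen if label == target)
        report[card] = {
            "agreement": hits / len(chosen) if chosen else 0.0,
            "most_common": Counter(chosen).most_common(1)[0][0] if chosen else None,
        }
    return report

expected = {"Speeches": "News & Events", "Voter registration": "Getting Involved"}
users = [
    {"Speeches": "News & Events", "Voter registration": "Getting Involved"},
    {"Speeches": "About the Candidate", "Voter registration": "Getting Involved"},
]
print(placement_agreement(expected, users))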

  15. Provide Global Navigation Cards: About the Candidate, Related Links, Getting Involved, Sitemap, News & Events, On The Issues, Website Feedback, Privacy Policy

  16. First Pass at Dividing Cards: About the Candidate, Related Links, Getting Involved, Sitemap, Candidate’s bio, Endorsements, Candidate’s record and accomplishments, Voter registration, Campaign staff and openings, Campaign donations, Ask the candidate a question, Let a friend know about this website, News & Events, On The Issues, Website Feedback, Privacy Policy, Press releases, Election issues, On-the-road journal, Candidate comparison, Campaign timeline, Frequently asked questions, Media coverage, Speeches, Newsletter, Campaign events

  17. Provide Second-Level Labels. News & Events cards: Press releases, On-the-road journal, Campaign timeline, Media coverage, Speeches, Newsletter, Campaign events; new second-level label cards: Events, News from the Candidate, In the Media

  18. Further Subdivision Occurs. News & Events: Events (Campaign timeline, Campaign events); In the Media (Media coverage, Press releases, Newsletter); News from the Candidate (On-the-road journal, Speeches)

  19. User Testing and Scent • User testing of information scent tends to work best with focused, information-seeking tasks

  20. Quantitative User Test Metrics • Path directness • determine ‘optimal path’ and number of clicks • calculate number of clicks it takes user to reach destination and compare • Path frequency • which paths are chosen most frequently? • Time & Completion Rate • Satisfaction
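
A minimal sketch of the path-directness and completion-rate calculations, assuming test results are captured as a list of per-task records; the data format and function names are illustrative assumptions.

# Path directness: ratio of optimal clicks to clicks actually taken.
# Completion rate: share of tasks finished successfully.
def path_directness(optimal_clicks, actual_clicks):
    """1.0 means the user took the optimal path; lower means a less direct path."""
    return optimal_clicks / actual_clicks if actual_clicks else 0.0

def completion_rate(results):
    """results: list of dicts like {'completed': bool, 'clicks': int}."""
    done = [r for r in results if r["completed"]]
    return len(done) / len(results) if results else 0.0

results = [
    {"completed": True, "clicks": 4},
    {"completed": True, "clicks": 7},
    {"completed": False, "clicks": 12},
]
optimal = 3
for r in results:
    print(path_directness(optimal, r["clicks"]))
print("completion rate:", completion_rate(results))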

  21. Qualitative User Test Metrics • User comments • both written and verbal • Signs of indecision • hovering back and forth between two global navigation links • Indications of frustration and confusion

  22. Supporting Information Scent • User testing in particular generates many suggestions for supporting information scent • These often relate more to interface design decisions than to conceptual design

  23. Support Options • Scope indications • ‘See Also’ links • Facet-based browsing • Scent stress test for search

  24. Time Reporting System

  25. TRS Categories • IA • Design: Graphic Interface Design • Image Processing • Technical Architecture • Site building • Special Features • Fulfillment Integration • Customer Service Integration • Etracking • QA • Hosting • Project Management

  26. TRS • Designed to provide a central repository of information about consultants’ weekly billed hours. • The system allows users to keep track of their time and administrators to view that time in different report formats to facilitate the payroll and invoicing processes.

  27. TRS • Users are able to enter and edit time information for the current week. • The application is browser-based. • Audience • Consultants (keep track of their time) • Administrators (payroll and client invoicing)

  28. TRS • Users are able to: • Log into the system and set up their list of current projects; detailed information is available for each project to help the consultant find the right one • Timesheets are dynamically created, and the user is able to add, edit, or delete their time for the current week • By clicking the Total button, the user can view the total hours per client • Once a timesheet is submitted, it becomes locked and cannot be edited
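
The behavior described above (per-week timesheets that lock on submission) might look roughly like the following sketch; the class and field names are assumptions, not the actual TRS code.

# Sketch of a timesheet: created per week, editable until submitted, then locked.
from dataclasses import dataclass, field

@dataclass
class Timesheet:
    consultant: str
    week_of: str                                  # e.g. "2003-02-17"
    entries: dict = field(default_factory=dict)   # project -> hours
    submitted: bool = False                       # locked once True

    def log_hours(self, project, hours):
        if self.submitted:
            raise ValueError("Timesheet is locked; submitted sheets cannot be edited")
        self.entries[project] = self.entries.get(project, 0) + hours

    def total_hours(self):
        return sum(self.entries.values())

    def submit(self):
        self.submitted = True

ts = Timesheet(consultant="A. Consultant", week_of="2003-02-17")
ts.log_hours("Site building", 6)
ts.submit()
print(ts.total_hours())                           # 6
# ts.log_hours("QA", 2) would now raise ValueError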

  29. TRS, Administrative Functions • Add new users to the system, as well as edit existing user data. • Add, edit, or delete the current list of client projects, as well as comments related to the project, the targeted completion date, and whether the project is billable or not. • Check which timesheets for the previous week have been submitted and which are outstanding. • On Sunday nights, the system emails all consultants who have not yet submitted their timesheets.
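
A sketch of that Sunday-night reminder logic; the data shapes and the injected send_email callable are assumptions for illustration only.

# Find consultants with no submitted timesheet for the week just ended and remind them.
def consultants_missing_timesheets(all_consultants, submitted, week_of):
    """submitted: set of (consultant, week_of) pairs already turned in."""
    return [c for c in all_consultants if (c, week_of) not in submitted]

def send_reminders(missing, send_email):
    for consultant in missing:
        send_email(
            to=consultant,
            subject="Timesheet reminder",
            body="Your timesheet for last week has not been submitted.",
        )

# Example run with a stand-in for the real mail function
missing = consultants_missing_timesheets(
    ["A. Consultant", "B. Consultant"],
    submitted={("A. Consultant", "2003-02-17")},
    week_of="2003-02-17",
)
send_reminders(missing, send_email=lambda **kw: print("reminder ->", kw["to"]))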

  30. TRS, Administrative Functions • Review the administrative action flags, which alert the administrator when a project has gone over its scheduled budget.

  31. Reports • Only available to administrators. • Filtering and sorting criteria available through drop-down menus. • A list of all hours billed to each project • A list of consultants with outstanding timesheets (not yet submitted) • A list of hours per consultant for the week, for all employees
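
For illustration, the first report (hours billed to each project) could be produced with a query along these lines; the in-memory table and its column names are assumptions made up for this example, not the real TRS schema.

# Hours billed to each project for a given week, highest first.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE time_entries (consultant TEXT, project TEXT, week_of TEXT, hours REAL)")
conn.executemany(
    "INSERT INTO time_entries VALUES (?, ?, ?, ?)",
    [
        ("A. Consultant", "Site building", "2003-02-17", 12.0),
        ("A. Consultant", "QA", "2003-02-17", 3.5),
        ("B. Consultant", "Site building", "2003-02-17", 8.0),
    ],
)

rows = conn.execute(
    """
    SELECT project, SUM(hours) AS total_hours
    FROM time_entries
    WHERE week_of = ?
    GROUP BY project
    ORDER BY total_hours DESC
    """,
    ("2003-02-17",),
).fetchall()

for project, total in rows:
    print(project, total)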

  32. Staying on the Same Page … using the project site

  33. Communicating Using the Project Site • project site makes all documents readily available • client-contractor communication center • extranet • password protected • showcase for new clients

  34. A Tool for the Producer • Producer – manages the project site • Needs quick and accurate access to project information • Brings new team members up to speed • Takes time to implement and maintain, but worth the effort • For asynchronous communication

  35. Project Site Access • Key members • client and contractor decision-makers • access to contract, financial updates, etc. • Team • includes client, contractor and sub personnel • access to all but financial, legal info • Testers • answer questionnaires, review site • Subcontractors • same access as team
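
One simple way to express the access levels above is a role-to-section map, sketched below; the role and section names are shorthand for the slide's categories, not an actual implementation.

# Role-based view of who can see which areas of the project site.
ACCESS = {
    "key_member":    {"contract", "financials", "schedule", "documents", "reviews"},
    "team":          {"schedule", "documents", "reviews"},   # no financial or legal info
    "subcontractor": {"schedule", "documents", "reviews"},   # same access as team
    "tester":        {"questionnaires", "review_site"},
}

def can_view(role, section):
    return section in ACCESS.get(role, set())

print(can_view("tester", "financials"))   # False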

  36. Content of a Project Site • Project site design should be simple, easy to navigate • Site mission • Calendar • high-level view of project • phases, milestones, progress • Schedule • focus on team deadlines • highlights next couple of weeks

  37. Content of a Project Site • chronology • content • site maps • different types • should use a software tool • other diagrams, illustrations • visual explorations • client presentations • beta tests • give each test a different version number

  38. Content of a Project Site • Contact page • first item in the reference section of the site • comprehensive • all key personnel, roles, responsibilities • include name, phone, live e-mail link • Resources page • contracts • change by adding an addendum

  39. Content of a Project Site • resources page • progress reports • links to weekly reports • invoices • record of time and money spent on the site • access to upper management only • running total helpful • pre-existing content • brochures, logos, etc., if they pertain to the project

  40. Content of a Project Site • Resources page • other content • site reviews, diagrams, correspondence, etc. • Link to production site • Help page • can’t predict who will visit the site or what their surfing skill is • provide assistance for what might not be clear

  41. Running a Project Site • use site frequently • only producer should update site • never revise chronology • can add items or change deadlines • writing should be concise, consistent • edit, edit, edit

  42. Running a Project Site • Put up only what client should see • not internal discussions, temp versions • Archive internal communication off-line • Update schedule as necessary • archive old schedules • separate long- and short-term objectives • Key personnel should bookmark site

  43. Improving the Project Site • Project management software • becoming web-based • Thinking out-of-the-box • sometimes old methods are better • printed copy • face-to-face meetings • telephone calls • Key is communication between the client and development teams

  44. Communicating is the Key • Proper communication avoids misunderstanding • Causes of poor communication • people from different disciplines • lack of mutual understanding of terminology • personalities • hidden agendas

  45. Communicating is the Key • Causes of poor communication • ineffective meetings • proximity of team members • assumptions • poor infrastructure and support • taking advice of an “expert” • fear and irresponsibility • lack of good communications structure

  46. Effective Communication Systems • Typical communication chain
