
DITA code reviews


Presentation Transcript


  1. DITA code reviews
  Presenters: Megan Bock, Shannon Rouiller, Jenifer Schlotfeldt

  2. Agenda
  • What is a code review?
  • Why do we do code reviews?
  • What do we find in code reviews?
  • Using a CSS to identify incorrect markup
  • What is the process for code review?
  • Tracking code reviews
  • Code review demo

  3. What is a code review?
  • It’s a learning process to fix coding errors early in the authoring process (or just after migration)
  • DITA source files are reviewed by a team (the author, a DITA advocate, and an editor)
  • Not a comprehensive review of every topic
    • 3-10 representative topics (concept, task, reference)
    • 30 minutes to 1 hour
  • Writer is expected to implement review comments throughout their topic set

  4. Why do we do code reviews?
  • To help each other learn correct DITA markup
  • To ensure consistent markup in your library, which results in:
    • Consistent output
    • Support for topic and map reuse
    • Simpler maintenance
    • Faster and smoother adoption of new tools and features
  • To identify possible requirements for DITA, our tooling, internal documentation, or our highlighting guidelines based on real usage

  5. What do we find in code reviews?
  • Element abuse (see the markup sketch after this list)
    • Unordered lists instead of parameter lists or definition lists
    • Visual elements (typically bold or italic) instead of semantic elements
    • Ordered or unordered lists instead of step markup (steps, substeps, and choices)
    • Code phrase and code block elements for random monospaced output
    • Pseudo-heads and pseudo-notes
    • Incorrect markup left by migration
  • Element neglect
    • Missing short descriptions
    • Scattered index entries
    • Incorrect structure of step content
    • Simulated menu cascades
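  For illustration, here is a small DITA fragment written for this transcript (it is not taken from the slides). The first half shows two of the abuses listed above: a bold highlight standing in for a UI control name, and an unordered list standing in for a parameter list. The second half carries the same content with semantic markup (uicontrol and parml).

    <!-- Element abuse: visual highlighting and a generic list -->
    <p>Click <b>Save</b> to store the settings.</p>
    <ul>
      <li>timeout - the number of seconds to wait</li>
      <li>retries - the number of attempts before failing</li>
    </ul>

    <!-- Semantic markup: a UI control element and a parameter list -->
    <p>Click <uicontrol>Save</uicontrol> to store the settings.</p>
    <parml>
      <plentry>
        <pt>timeout</pt>
        <pd>The number of seconds to wait.</pd>
      </plentry>
      <plentry>
        <pt>retries</pt>
        <pd>The number of attempts before failing.</pd>
      </plentry>
    </parml>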

  6. Semantic elements that are new in DITA (several are shown in the sketch after this list)
  • apiname – Use for API names such as a Java class name or method name.
  • wintitle – Use for title text that appears at the top of a window or dialog; it applies to wizard titles, wizard page titles, and pane titles.
  • menucascade – Use to document a series of menu choices, or to show any choice on a menu from which the user needs to choose.
  • note type – Use to expand on or call attention to a particular point.
  • term – Use to identify words that represent extended definitions or explanations.
  • shortdesc – Use to represent the purpose or theme of the topic. This information is displayed as a link preview and is used for searching.
  • Message markup (msgblock, msgnum, msgph)
  • Step markup (step info, result, example)
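  To make these elements concrete, here is a short DITA task sketch written for this transcript: the element names are standard DITA, but the topic content is invented purely for illustration.

    <task id="set-log-level">
      <title>Setting the log level</title>
      <shortdesc>Raise the log level so that support can diagnose
      connection problems.</shortdesc>
      <taskbody>
        <context>
          <p>The <apiname>java.util.logging.Logger</apiname> class writes
          messages to the <wintitle>Console</wintitle> view.</p>
          <note type="tip">Restart the server after you change the level.</note>
        </context>
        <steps>
          <step>
            <cmd>Click <menucascade><uicontrol>Window</uicontrol>
            <uicontrol>Preferences</uicontrol>
            <uicontrol>Logging</uicontrol></menucascade>.</cmd>
            <info>The <wintitle>Logging</wintitle> page opens.</info>
            <stepresult>The new level takes effect immediately.</stepresult>
          </step>
        </steps>
      </taskbody>
    </task>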

  7. findBadTags.css

  8. findBadTags.css
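  The stylesheet itself was shown on slides 7 and 8 and is not reproduced in this transcript. As a rough sketch of the approach, a "find bad tags" CSS applied to DITA source in an XML editor simply paints suspect elements so they stand out during review. The selectors below name real DITA elements, but the rules and colors are invented here for illustration; they are not taken from findBadTags.css.

    /* Sketch only: highlight markup that often signals element abuse. */
    b, i, u, tt {
      background-color: yellow;   /* visual markup instead of semantic elements */
    }
    ul > li > b:first-child {
      background-color: orange;   /* a bolded first phrase can be a pseudo-head */
    }
    required-cleanup {
      background-color: red;      /* markup left behind by migration */
      color: white;
    }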

  9. Process example
  • Optional: Writer applies a special CSS (findBadTags) to check for improper tags (mostly highlighting).
  • Writer has the topics edited:
    • Writer submits a selection of topics to the editor for an edit.
    • Editor does a technical or copy edit and checks the output for obvious incorrect formatting and tagging.
  • Writer incorporates the edits and submits the revised topics for a code review.
  • The DITA advocate, editor, and writer meet, and the DITA advocate and editor review tagging and provide input to the writer about best practices. Comments are documented.

  10. How does the process vary?
  • Timing of code reviews: the earlier, the better.
  • Voluntary code reviews vs. mandatory code reviews.
  • Informal vs. formal code reviews.
  • People in the code review: the whole writing team vs. just one writer.
  • Files to submit: DITA files (including conref source files, DITA maps, and art files) and output files (for example, an Eclipse plug-in).
  • Special CSS: findBadTags, to check for improper tags (mostly highlighting).

  11. Ways to track code reviews
  • Files
  • Comments

  12. Ways to track code reviews (continued)

  13. Code Review Demo
  • Sample files
