Smart Grid Discussions – November 2010

Presentation Transcript
  1. Smart Grid Discussions – November 2010 Date: 2010-November-08 Abstract: NIST PAP#2 Report r6 recommended changes • Other Smart Grid activities Bruce Kraemer, Marvell

  2. Monday Agenda Item 4.1.12 Smart Grid Meetings Bruce Kraemer, Marvell

  3. Agenda Topics for the Week Action Item • Finalize change suggestions for the NIST PAP#2 Report • Information Items • SGIP update • OpenSG update • P2030 update • ITU Focus Group • March Tutorial topics/speakers Bruce Kraemer, Marvell

  4. NIST Timeline (re-confirmed Nov 4) • July 28, 2010: Draft 0.5 • August 4, 2010: Call for input to Section 6 • September 15, 2010: End of draft 0.5 review period • September 16, 2010: SGIP face-to-face, St Louis (tentative PAP 2 meeting) • September 30, 2010: Release of draft 0.6 • October 29, 2010: End of draft 0.6 review period • November 4, 2010: OpenSG meeting, Miami (tentative PAP 2 meeting) • SGIP face-to-face, Chicago (PAP 2 meeting) • December 3, 2010: Release of Version 1 Bruce Kraemer, Marvell

  5. PAP#2 Report was updated Oct 1 • http://collaborate.nist.gov/twiki-sggrid/pub/SmartGrid/PAP02Wireless/NIST_Priority_Action_Plan_2_r06.pdf Bruce Kraemer, Marvell

  6. NIST PAP#2 Report v6 – Section 4 4.1 Technology Descriptor Headings To describe a wireless technology, a set of characteristics was identified and organized into logical groups. The group titles are listed below. • 1. Link Availability • 2. Data/Media Type Supported • 3. Coverage Area • 4. Mobility • 5. Data Rates • 6. RF Utilization • 7. Data Frames & Packets • 8. Link Quality Optimization • 9. Radio Performance Measurement & Management • 10. Power Management • 11. Connection Topologies • 12. Connection Management • 13. QoS & Traffic Prioritization • 14. Location Characterization • 15. Security & Security Management • 16. Radio Environment • 17. Intra-technology Coexistence • 18. Inter-technology Coexistence • 19. Unique Device Identification • 20. Technology Specification Source • 21. Deployment Domain Characterization • 22. Exclusions Bruce Kraemer, Marvell

  7. IEEE 802 contributed a number of suggestions on how to change the NIST PAP#2 Report r6. These were contained in documents 1209 and 1210: https://mentor.ieee.org/802.11/dcn/10/11-10-1209-00-0000-comment-set-1-on-pap-2-report-r6.doc https://mentor.ieee.org/802.11/dcn/10/11-10-1210-01-0000-comment-set-2-on-pap-2-report-r6.ppt Bruce Kraemer, Marvell

  8. Material for this meeting Section 4 edited Bruce Kraemer, Marvell

  9. Comment #01 • Section 4.2.1.3 talks about Coverage Area. It is important to discuss coverage in conjunction with data rates and link margin for example, in order to avoid associations between inconsistent pieces of information, e.g., citing the largest coverage area achievable by a given technology along with the highest data rate achievable by the technology is incorrect – generally the two have a reverse relationship and the highest coverage is achievable at the lowest data rate. • Agreed to text change: • Add the following text at the end of Section 4.2.1.3: When comparing coverage areas between different technologies, it is important to take into account the link budgets used in the coverage computation. Note that the largest coverage area achievable by a specific technology typically requires transmission at the lowest data rate used by that technology. Bruce Kraemer, Marvell
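The inverse relationship between coverage and data rate described in this comment can be illustrated with a small link-budget calculation. This is a sketch only: the free-space model, the 915 MHz frequency, and the two receiver sensitivities are hypothetical values chosen for illustration, not figures from the PAP#2 report.

```python
import math

def max_range_km(tx_dbm, rx_sens_dbm, freq_mhz, gains_db=0.0):
    # Link budget: transmit power plus antenna gains minus receiver
    # sensitivity gives the maximum tolerable path loss.
    budget_db = tx_dbm + gains_db - rx_sens_dbm
    # Invert the free-space path-loss formula
    # FSPL(dB) = 32.44 + 20*log10(f_MHz) + 20*log10(d_km).
    return 10 ** ((budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

# Hypothetical sensitivities: the high-rate mode needs more SNR, so
# its sensitivity threshold is 20 dB higher than the low-rate mode's.
low_rate_range = max_range_km(20, -95, 915)   # illustrative low-rate mode
high_rate_range = max_range_km(20, -75, 915)  # illustrative high-rate mode
```

Because a 20 dB tougher sensitivity requirement costs a factor of ten in free-space range, quoting a technology's maximum coverage together with its maximum data rate overstates both, which is exactly the inconsistency the agreed text warns against.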

  10. Comment #02a • Section 4.2.1.4 talks about Mobility. It would be useful to mention the data rates achievable at various mobility levels to avoid assumptions that mobile devices can communicate at the highest data rates used by a specific technology. • Agreed to text change: • Add the following text at the end of Section 4.2.1.4: Comparisons between the capabilities of different mobile technologies have to take into account the maximum data rate achievable at each mobility level -- mobile devices may not be able to communicate at the highest available data rates when moving at high speeds. Bruce Kraemer, Marvell

  11. Comment #03 • Section 4.2.1.5 talks about Data Rates. • Agreed text change: • Add the following text at the end of Section 4.2.1.5: Additional factors to consider when discussing data rates: • Throughput must be considered in conjunction with packet size, coverage range and rate of mobility (if any). • It is important to distinguish between unicast, multicast and broadcast rates, as they may not be the same for a given wireless technology. • Throughput depends on medium access scheduling, including the capability to provide block transmissions (whereby multiple data packets can be sent in succession with minimum or no individual medium access operations per packet except before the first packet is sent), and/or block acknowledgements (whereby a single acknowledgement packet can acknowledge multiple preceding data packets). The capability and flexibility to optimize block transmissions and acknowledgements can have a significant effect on GoodPut. • The use of rate adaptation mechanisms, where the data rate on a link is modified when the quality of the link changes. Bruce Kraemer, Marvell
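The effect of block transmissions on goodput mentioned above can be sketched with a toy model. The payload size, PHY rate, and per-access overhead below are invented round numbers, not parameters of any particular technology.

```python
def goodput_bps(payload_bits, rate_bps, access_overhead_s, block_size):
    """Goodput when `block_size` packets share one medium access.

    The fixed medium-access overhead (contention, preambles,
    acknowledgements) is amortized over the whole block;
    block_size=1 models per-packet access.
    """
    airtime_s = block_size * payload_bits / rate_bps
    total_s = airtime_s + access_overhead_s
    return block_size * payload_bits / total_s

# Hypothetical link: 1500-byte (12,000-bit) packets at 54 Mb/s with
# 100 us of fixed overhead per medium access.
per_packet = goodput_bps(12_000, 54e6, 100e-6, block_size=1)
block_of_8 = goodput_bps(12_000, 54e6, 100e-6, block_size=8)
```

The nominal data rate is never reached in either case, but grouping eight packets per access recovers a substantially larger fraction of it, which is why the comment singles out block transmission and block acknowledgement as significant for goodput.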

  12. Add these definitions to Section 2.2 Broadcast • Broadcast is a form of message transmission where a message is sent from a single source to all potential receiving nodes. Multicast • Multicast is a form of message transmission where a message is sent from a single source to a subset of all potential receiving nodes. (The mechanism for selecting the members of the subset is not part of this definition.) Unicast • Unicast is a form of message transmission where a message is sent from a single source to a single receiving node. Bruce Kraemer, Marvell

  13. Comment #04 • Section 4.2.1.6 talks about RF utilization. • Agreed text change: • Add the following text at the end of Section 4.2.1.6: • Consider the power level regulations for the different channels used by a particular technology. • Consider the impact of Dynamic Frequency Selection (DFS) regulations on the channels used by a particular technology, e.g., certain UNII channels are subject to DFS regulation which requires wireless devices to change channel when they detect the use of radar on their current channel. Bruce Kraemer, Marvell

  14. Comment #05 • Section 4.2.1.7 talks about Data Frames and Packets. It is important to consider frame duration in conjunction with data rate and size of the frame. Also, we need to consider multicast and broadcast frames in addition to unicast frames. • Agreed text change: • Modify item “a)” in Section 4.2.1.7 as follows: • What is the maximum frame duration for a unicast, multicast and broadcast frame respectively, and what are the corresponding frame size and data rate at which each type of frame was sent? • Modify item “b)” in Section 4.2.1.7 as follows: • What is the maximum packet size that can be sent in one unicast, multicast and broadcast radio frame respectively? • Modify item “c)” in Section 4.2.1.7 as follows: • Does the radio system support segmentation of unicast, multicast and broadcast packets respectively, when the payload size exceeds the capacity of one radio frame? Bruce Kraemer, Marvell
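The point that a frame duration only has meaning alongside the frame size and data rate can be shown directly. The sizes, rates, and preamble figure below are hypothetical.

```python
def frame_duration_us(payload_bytes, data_rate_mbps, preamble_us=20.0):
    """Air time of one frame: a fixed-duration preamble plus the
    payload sent at the chosen data rate (bits / (Mb/s) = microseconds).
    All parameters are illustrative, not from any specific PHY."""
    return preamble_us + payload_bytes * 8 / data_rate_mbps

# The same 1500-byte packet occupies vastly more air time at 1 Mb/s
# than at 54 Mb/s, so a "maximum frame duration" must be quoted with
# the frame size and the data rate at which the frame was sent.
slow_us = frame_duration_us(1500, 1)
fast_us = frame_duration_us(1500, 54)
```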

  15. Comment #06 • Section 4.2.2.4 talks about Connection Topologies. The Bus and Ring topologies need to be removed; they are not wireless topologies. One way to characterize wireless topologies is as single hop and multi-hop (statically configured or mesh), and wireless links as point-to-point, point-to-multipoint, and omnidirectional. We need to add figures that correspond to the text we end up with. • Agreed text change: • Remove the Bus and Ring figures • Replace the current text in Section 4.2.2.4 with the following: Wireless network topologies can be divided into single hop and multi-hop, where a multi-hop topology can be statically configured, or can be dynamic and self-forming, e.g., a mesh. A wireless link can be point-to-point, point-to-multipoint, or broadcast. • Add the definitions on the following 4 slides to Section 2.2 Bruce Kraemer, Marvell

  16. Hop Definitions • Proposed PAP2 Guidelines Document Definitions • Hop: The term hop is used to signify a link between a pair of devices that a frame or packet needs to traverse to reach one device from the other. • Single-Hop Network: A single-hop network is one in which devices can only communicate with each other directly, e.g., over a single link (hop), and do not have the capability to forward traffic on each other’s behalf. • Multi-Hop Network: A multi-hop network is one in which devices have the capability to forward traffic on each other’s behalf and can thus communicate along paths composed of multiple links (hops). Bruce Kraemer, Marvell

  17. Configuring Definition • Statically Configured Multi-Hop Network: A multi-hop network can be statically configured, such that each node’s forwarding decisions are dictated by configuration. • Dynamic and Self-Configuring Multi-Hop Network: A multi-hop network can be dynamic and self-configuring, such that network devices have the ability to discover (multi-hop) forwarding paths in the network and make their own forwarding decisions based on various pre-configured constraints and requirements, e.g., lowest delay or highest throughput. Bruce Kraemer, Marvell

  18. MESH Definition • Mesh Network: A mesh network is a dynamic self-configuring network composed of devices that can forward traffic on each other’s behalf, have the ability to discover (multi-hop) forwarding paths in the network and make their own forwarding decisions based on various pre-configured constraints and requirements, e.g., lowest delay or highest throughput. Bruce Kraemer, Marvell
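The mesh definition above — devices that discover multi-hop forwarding paths and forward traffic on each other's behalf — can be made concrete with a toy path-discovery routine. The four-node topology and the minimum-hop metric are assumptions for illustration; as the definition notes, real mesh protocols select paths on richer pre-configured constraints such as lowest delay or highest throughput.

```python
from collections import deque

def find_path(links, src, dst):
    """Breadth-first search for a minimum-hop forwarding path in a
    multi-hop network described as an adjacency map."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # unreachable: no multi-hop path exists

# Hypothetical mesh: A can reach D only by forwarding through B or C.
mesh = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
```

In a single-hop network, by contrast, A could only communicate with its direct neighbors B and C, since no node forwards on another's behalf.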

  19. Comment #07 • Section 4.2.2.5 talks about Connection Management. The section needs to mention what aspects of “connection management” can be used to compare different wireless technologies. For example, we can evaluate the latency to join a network, available security mechanisms employed when joining a network, and overhead to join the network (number of control packets exchanged). Perhaps section titles such as “Network Participation Mechanisms” or “Joining the Network” are more descriptive of the content of this section. Bruce Kraemer, Marvell

  20. Comment 07b Add the following text at the end of Section 4.2.2.5: • It is important to evaluate: • the time it takes for a device to join a particular network, and the overhead required to do so • the time and overhead required to rejoin the network when a device becomes disconnected from the network • the overhead required to maintain membership in the network after the initial admission into the network • the overhead associated with optimizing connectivity, e.g., in mesh-based topologies. Bruce Kraemer, Marvell

  21. Comment #08 • Section 4.2.3.2 talks about Location Characterization. It seems like many of the techniques applicable to this section are not technology-specific but implementation-specific and as such can be incorporated across different wireless technologies even if they are not currently incorporated into the products of a specific wireless technology. It would be helpful to make the distinction between technology-specific properties and product-specific properties in the text. • Agreed text change: • Add the following text at the end of Section 4.2.3.2: • It is important to distinguish between technology-specific mechanisms for location characterization and mechanisms that are applicable across technologies or communication topologies, which can easily be added to products that may not currently support them. Bruce Kraemer, Marvell

  22. Comment #09 • A category that is missing from Section 4 is one that characterizes the deployment complexity of each technology. • Agreed text change: Add the following text after Section 4.2.4.1: • 4.2.5 Group 22: Deployment Complexity • It is important to evaluate the complexity of: • installation and maintenance of a given wireless system • integration with other, possibly existing, networks • expansion of the wireless network coverage over time. Bruce Kraemer, Marvell

  23. General Comment #10 • It would be helpful to have some tables and text summarizing the information in Section 5, and to move a lot of the discussions/derivations to an appendix. Otherwise, the message/conclusions/recommendations get lost in the text. Bruce Kraemer, Marvell

  24. General Comment #11 Section 4.2.1.2 (p. 24) talks about voice and video traffic over the smart grid. We need more use cases motivating why we would want to have voice and video traffic over the smart grid network. The current set of use cases supplied by OpenSG does not currently contain this service. The only video example given in the text is one of surveillance of affected outage areas. It would seem that voice and video might be of lower priority during outages, e.g., caused by disasters or weather-related events, since the network would require a high degree of availability for its regular functions. In addition, surveillance is generally part of the public safety infrastructure and there is spectrum allocated for such use so I am not convinced that we should be discussing this kind of application in the context of the smart grid. • Applications such as voice and video have requirements that even broadband network providers are struggling with (wireless and landline) and making them part of the smart grid infrastructure requires significant justification. Bruce Kraemer, Marvell

  25. General Comment #12 • Link Availability in Section 4.2.1.1 does not appear to be consistently calculated for the various candidate radio technologies, nor did the majority of the technology candidates describe the method used to calculate availability. • The current description of the characteristic does not match the calculation. • Both of these issues need to be resolved before progressing to completion of Sections 6 & 7. • “The technology “Operating Point” chosen is presumably chosen recognizing that achieving a low failure rate is desirable.” • Agreed text change: Change this sentence to • “The technology “Operating Point” is chosen to achieve a low failure rate and is an outcome of deployment flexibility & strategy.” Bruce Kraemer, Marvell
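One reason the reported availability figures are hard to compare is that respondents did not state their calculation method. As a sketch of just one conventional method (an assumption for illustration, not what any respondent necessarily did), steady-state availability can be derived from mean time between failures and mean time to repair:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR).

    This is one conventional definition among several; a technology
    submission should state explicitly which method it used, which
    is the gap this comment identifies."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical operating points for the same link hardware: the
# achieved availability depends heavily on repair time, i.e. on
# deployment strategy, not on the radio technology alone.
well_maintained = availability(mtbf_hours=999, mttr_hours=1)
slow_repair = availability(mtbf_hours=999, mttr_hours=111)
```

This also motivates the agreed wording change: the operating point achieved is an outcome of deployment flexibility and strategy, not of the technology in isolation.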

  26. Comment #13 Para 2 Recommended change • Reword the preface to incorporate the idea that SG application requirements evolve over time, yielding to experience rather than remaining locked in 1989 or 1999 or 2009 economics. • Smart Grid application requirements must be defined with enough specificity to quantitatively define communications traffic and levels of performance over the lifetime of the applications. Application requirements must be combined with as complete a set of management and security requirements as possible for the life-cycle of the equipment. The decisions to apply wireless for any given set of applications can then be based on expected performance and costs over the projected useful lifetimes of the spectrum and equipment. Bruce Kraemer, Marvell

  27. Agenda Topics for the Week Action Item • Finalize change suggestions for the NIST PAP#2 Report • Information Items • SGIP update • OpenSG update • P2030 update • ITU Focus Group • March Tutorial topics/speakers Bruce Kraemer, Marvell

  28. Catalog of Standards Mark Klerer, SGIP Plenary Vice Chair

  29. Catalog of Standards (Status of Work in Progress) Plenary leadership team working in conjunction with SGAC and other SGIP working groups • Proposed Scope of the Standards Catalog • Standards and guides recognized as relevant for enabling SG capabilities • Proposed Objectives of the Standards Catalog • Explain value & purpose of the catalog for the SG community • Influential, but independent of NIST/FERC decision-making • Characterize the various specification organizations with respect to their processes in developing their specifications • Provide an annotated resource that identifies standards created by recognized SSOs and/or industry consortia that are relevant to Smart Grid applications • Identify functional areas of the smart grid where each standard is appropriate (draw on SGAC work)

  30. Catalog of Standards: Process & Structure • Process • NIST Framework and Roadmap for SG Interoperability v1.0 identifies many standards to consider • Additional standards can be identified to the SGIP Administrator by any SGIP member for potential inclusion in the catalog • Relevance and importance evaluated by the appropriate SGIP working group (e.g., DEWG, PAP, etc.) and consensus developed • 75% approval by SGIP membership required prior to SGIPGB approval for inclusion in the catalog • Standards included in the catalog may be deprecated from further use due to changes in technology or needs by following the same process. • Catalog Structure • Entries in the catalog to be structured based on the application domains defined in the Framework and further classified by GWAC stack • Relationship to NIST and FERC lists • Standards Catalog strives for accurate characterization and relevance to the smart grid community, and avoids recommendation • Standards Catalog expected to be a larger compilation which can inform NIST and FERC in their decision processes

  31. Testing & Certification Committee Rik Drummond, SGTCC Chair

  32. Purpose • Establish a Testing and Certification Framework for the Smart Grid • Establish a brand called ‘Interoperability’ that has a consistent meaning across the Smart Grid for the buyers of interoperable products. • At this time a set of products deemed interoperable may be interoperable with an 80%, 95%, 99%, or 100% confidence level. Thus to say a product is interoperable has little current meaning in the marketplace, as many purchasing organizations have found.

  33. SGTCC Monthly Quad Chart (October 2010 Activities – PMO Monthly Report) • Deliverables: D3 – Interoperability Process Reference Manual (IPRM) is being finalized for SGIP review • Interoperability Maturity Assessment Tool completed • Activities and Accomplishments: D3 – IPRM completed 1st review and comment period during the St. Louis meetings; comment resolution and final editing remain in progress • Began piloting the IPRM with several Interoperability Testing and Certification Authorities (ITCAs) that have expressed willingness to cooperate and participate in assessing their organizations against the IPRM recommendations • Prepared a draft ITCA audit process document and checklist in preparation for ITCA reviews • Launched discussions with accreditation bodies for future independent ITCA reviews • Presentation on the SGTCC framework and plan to the SGIP on October 29 to build awareness and support for the process • Upcoming Key Milestones and Activities: Completing 2-3 ITCA reviews by late November • Updates to the IPRM based on experience gathered during the ITCA review process, and revision/release in early January • Engaging with the CSWG testing sub-team to coordinate security-related testing issues • Issues, Concerns, and Help Needed: Obtaining timely cooperation from the ITCAs to participate in the review process with the TCC, and accelerating their commitment to adopt and enact the SGTCC recommendations in their operations • Engaging end users to gain their commitment towards requiring IPRM conformance for ITCAs certifying the products that they purchase

  34. Definitions • ITCA – Interoperability Testing and Certification Authority • Framework Manual / IPRM – Interoperability Program Reference Manual • ISO 65 – General Requirements for Bodies Operating Product Certification Systems • ISO 17025 – General Requirements for the Competence of Testing and Calibration Laboratories • SGTCC Interoperability Test Construction Best Practices – list of best practices not covered in ISO 65 and ISO 17025 • SGTCC/CSWG Cyber Security Testing Best/Standard Practices – list of best practices not covered in ISO 65 and ISO 17025 • Interoperability Maturity Assessment Model – looking for IOP products based on standards NOW.

  35. General Structure of the Framework Manual • Framework Manual: Introduction, Responsibilities, Rationale, Usage and Checklists • Best/Standard Practices for Cyber Security Test Construction • Best Practices for IOP Test Construction • ISO Guide 65 • ISO 17025 • 2011 Transition Bootstrap Support Plan for ITCAs • Evaluation Checklist for ITCA (Delta to Manual)

  36. ISO Guide 65 Overview • ISO Guide 65 contains the requirements necessary for an organization to demonstrate competence to perform certification activities related to the standards or specifications stated in the certification • ISO Guide 65 criteria include: • Technical competence • Certifying personnel criteria; accessibility of certification test processes; assessment fairness and integrity, and others • Management systems • Quality management processes, technical dispute resolution processes • Lab qualification criteria, lists of certified products, record control, ongoing certification maintenance and withdrawal process • ISO Guide 65 conformance demonstrates a robust, thorough and meaningful certification program • Implements a monitoring program for IOP products in the field to ensure IOP is maintained

  37. ISO 17025 Overview • ISO 17025 contains all requirements that laboratories need to demonstrate that they • operate a management system, • are technically competent, • are able to generate technically valid results. • ISO 17025 is the most widely accepted and used standard for the operation of test laboratories • ISO 17025 applies to any testing laboratory operation (1st, 2nd or 3rd party), with many 3rd party labs formally accredited • It facilitates acceptance of test results from accredited laboratories and serves as the requirements that formal accreditation bodies apply in assessing laboratories.

  38. Best Practices for IOP test construction examples • Test suite specifications used for interoperability or conformance testing shall be managed in the same way as the standards they are derived from. • IOP certification test reports shall fully describe the test methodology used, including the justification for statistical or deterministic testing. • A certified interoperable product set shall also be conformant to the standard or profile of the standard. • The only means to ensure interoperability among products is to perform a full matrix test.
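The cost implied by the last bullet — that only a full matrix test ensures interoperability — grows quadratically with the number of certified products. A small counting sketch (the ecosystem sizes are arbitrary):

```python
def matrix_test_pairs(n_products):
    """Distinct product pairings in a full interoperability matrix
    test: n choose 2. The quadratic growth is why certification
    programs weigh full-matrix testing against conformance testing
    plus sampling."""
    return n_products * (n_products - 1) // 2

# Arbitrary ecosystem sizes: doubling the number of products roughly
# quadruples the pairwise test burden.
small_ecosystem = matrix_test_pairs(10)
large_ecosystem = matrix_test_pairs(20)
```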

  39. 2011 Transition Bootstrap year • SGTCC, with NIST, will help bootstrap the process by offering tutorial help in 2011 to the first few committed ITCAs: • Preliminary review of implemented ISO 65 and ISO 17025 processes. • Review and analysis of interoperability test construction best practices. • Other general guidance. • Maintain a list for the industry showing ITCAs in the process of implementing the Manual.

  41. 2012 and Beyond • ITCAs will use test labs that follow the ISO 17025 and ISO 65 standards and are accredited by the existing formal accreditation organizations. • SGTCC will maintain lists of SGIP-approved ITCAs (those implementing the Manual) for a standard and demonstrating the production of interoperable products. • The products of the standard will be monitored for interoperability in the field by the ITCA and secondarily by the SGTCC. • Accreditation bodies (e.g., NVLAP and ANSI) will periodically audit test labs and certification bodies using the Manual as guidance and re-accredit them. SGTCC will subsequently update the ‘SGIP ITCA Approved List’. • Note that many test labs now use ISO 17025, but not the IOP best practices; also, many ITCAs do not use ISO 65.

  42. Next Steps and Your Response • Receive SGIP consensus for Manual / Framework • Each SGIP member MUST REQUIRE the purchase of interoperable products to initiate the monetary incentive for many of the ITCAs to upgrade to the Manual / Framework. • Note: this is an issue about wide scale interoperability across the smart grid. Having only a percentage requiring interop products will in many ways leave us in our current state. • SGTCC will offer two Webinars in late November and early December to address questions and concerns. To be announced.

  43. GB Election Timeline – Even Stakeholders, 2010

  44. UPCOMING 2010 PLENARY EVENTS 30 Nov – 3 Dec: Grid-Interop, Chicago See http://www.grid-interop.com/2010/#agenda for detailed agenda

  45. 2011 Plenary Meeting Schedule