




1 Proposal for BENCHMARKING SIP NETWORKING DEVICES
draft-poretsky-sip-bench-term-01.txt
draft-poretsky-sip-bench-meth-00.txt
Co-authors: Scott Poretsky (Reef Point Systems), Vijay Gurbani (Lucent Technologies), Carol Davids (IIT VoIP Lab)
66th IETF Meeting – Montreal

2 Motivation
Service Providers are now planning VoIP and multimedia network deployments using the IETF-developed Session Initiation Protocol (SIP).
– The mix of SIP signaling and media functions has produced inconsistencies in vendor-reported performance metrics and has caused confusion in the operational community. (Terminology)
– SIP allows a wide range of configuration and operational conditions that can influence performance benchmark measurements. (Methodology)

3 Relevance to BMWG
-----Original Message-----
From: Romascanu, Dan (Dan) [mailto:dromasca@avaya.com]
Sent: Sunday, June 25, 2006 6:00 AM
I believe that the scope of the 'SIP Performance Metrics' draft is within the scope of what bmwg has been doing for a while, quite successfully, some say. On a more 'philosophical' plane, there is nothing that says that IETF work must strictly deal with defining the bits in the Internet Protocols - see http://www.ietf.org/internet-drafts/draft-hoffman-taobis-08.txt. And in any case, measuring how a protocol or a device implementing a protocol behaves can also be considered 'DIRECTLY related to protocol development'.
-----Original Message-----
From: nahum@watson.ibm.com [mailto:nahum@watson.ibm.com]
Sent: Friday, May 26, 2006 2:51 PM
SPEC wants to develop and distribute common code for benchmarking, as is done with SPECweb and SPECjAppServer. That code can and should use the standardized performance definitions agreed to by SIPPING and/or BMWG.

4 Industry Collaboration
– BMWG develops a standard to benchmark SIP networking device performance
– SIPPING WG develops a standard to benchmark end-to-end SIP application performance
– SPEC to develop industry-available test code for SIP benchmarking in accordance with IETF’s BMWG and SIPPING standards
-----Original Message-----
From: Poretsky, Scott
Sent: Thursday, June 22, 2006 8:00 PM
To: 'Malas, Daryl'; acmorton@att.com; gonzalo.camarillo@ericsson.com; mary.barnes@nortel.com
Cc: vkg@lucent.com; Poretsky, Scott
Subject: RE: (BMWG/SippingWG) SIP performance metrics
Yes Daryl. I absolutely agree. The item posted to BMWG focuses on single-DUT benchmarking of SIP performance. Your work in SIPPING is focused on end-to-end application benchmarking. It would be great (and I would even say a requirement) that the Terminologies for these two work items remain consistent with each other.

5 Scope…So Far
– Performance benchmark metrics for black-box benchmarking of SIP networking devices
– Terminology and Methodology documents
– Benchmark SIP:
  – Control
  – Media (can be VoIP, Video, IM, Presence, etc.)
  – Relationship between the performance of control and media
  – Various SIP transports (TCP and UDP)

6 Scope…Being Considered
– Transport with TLS? SCTP?
– Benchmark impact of NAT? TURN, STUN?
– Expand Tester to be Emulated Agent _and_ Emulated Server?
– What is a SUT? Include Relay, Media Gateway, SASF?

7 Terminology
3. Term definitions
  3.1 Test Components
  3.2 Test Setup Parameters
  3.3 Benchmarks
    3.3.1 Registration Rate
    3.3.2 Successful Session Rate
    3.3.3 Successful Session Attempt Rate
    3.3.4 Standing Session Capacity
    3.3.5 Session Completion Rate
    3.3.6 Busy Hour Session Connects (BHSC)
    3.3.7 Busy Hour Session Attempts (BHSA)
    3.3.8 Session Setup Delay
    3.3.9 Session Teardown Delay
    3.3.10 Standing Sessions
    3.3.11 IM Rate
    3.3.12 Presence Rate
Issue 1: Terms = SIP Session and Associated Media, SIP Server
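As a rough illustration of how two of the benchmarks above might be computed from a tester's per-attempt records (a minimal sketch; the record fields and function names are assumptions for illustration, not taken from the drafts):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SessionAttempt:
    invite_sent: float            # seconds: when the tester sent the INVITE
    ok_received: Optional[float]  # seconds: when the 200 OK arrived; None if the attempt failed

def successful_session_rate(attempts: List[SessionAttempt], window_s: float) -> float:
    """Sessions successfully established per second over the test window."""
    successes = sum(1 for a in attempts if a.ok_received is not None)
    return successes / window_s

def mean_session_setup_delay(attempts: List[SessionAttempt]) -> float:
    """Average INVITE -> 200 OK delay over the successful attempts."""
    delays = [a.ok_received - a.invite_sent
              for a in attempts if a.ok_received is not None]
    return sum(delays) / len(delays)

# Two successes and one failure over a 2-second window:
attempts = [SessionAttempt(0.0, 0.25), SessionAttempt(0.5, 0.9),
            SessionAttempt(1.0, None)]
print(successful_session_rate(attempts, window_s=2.0))  # 1.0
print(mean_session_setup_delay(attempts))               # 0.325
```

The point of separating the two metrics is the one the terminology draft makes: a rate counts completed setups per unit time, while a delay characterizes each individual setup.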

8 Methodology
Test Cases:
– Registration Rate
– Session Rate
– Session Rate with Loop Detection Enabled
– Session Rate with Forking
– Session Rate with Forking and Loop Detection
– Session Rate with Media
– IM Rate
– Presence Rate
– Session Attempt Rate
– Session Capacity
– Session Attempt Rate with Media
– Session Capacity with Media
(Open Issues)
ISSUE 1: Is a SUT required for benchmarking the associated media? Media does not pass through a Proxy Server but would pass through a Firewall/NAT.
ISSUE 2: Add test cases to benchmark forwarding performance specific to NAT’ing with SIP.
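Rate-style test cases like Registration Rate are typically run as a search for the highest offered load the DUT sustains without failures. One way to sketch that search (an assumed binary-search harness for illustration; the drafts do not prescribe this procedure, and `run_trial` stands in for a real SIP load generator):

```python
def max_registration_rate(run_trial, low=0.0, high=1000.0, tol=1.0):
    """Binary-search for the highest offered registration rate (REGISTERs/sec)
    that the DUT still handles without failures.

    `run_trial(rate)` is assumed to offer REGISTER requests at `rate`
    for a fixed duration and return True iff every registration succeeded.
    """
    best = 0.0
    while high - low > tol:
        mid = (low + high) / 2
        if run_trial(mid):
            best, low = mid, mid   # DUT kept up: search higher
        else:
            high = mid             # DUT failed: search lower
    return best

# Toy stand-in for a DUT that copes with up to 300 registrations/sec;
# the search converges to within `tol` of that limit.
print(max_registration_rate(lambda rate: rate <= 300))
```

The same harness shape applies to the Session Rate and Session Attempt Rate cases by swapping in a trial function that offers INVITEs instead of REGISTERs.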

9 Next Steps
Work Item Discussion
– Is this work item of interest to the BMWG?
– Input for Scope of Work?
– Thoughts on collaboration with SIPPING WG and SPEC?
Document Discussion
– Feedback on selected terminology?
– Ideas for additional methodology test cases?
– Any other comments for the documents?
Author Actions
– Define Scope of Work
– Resolve identified issues for methodology
– Submit methodology -00 consistent with BMWG terminology and work in SIPPING

