
1 SIP Performance Benchmarking
draft-ietf-bmwg-sip-bench-term-02
draft-ietf-bmwg-sip-bench-meth-02
July 24, 2010
Prof. Carol Davids, Illinois Inst. of Tech.
Dr. Vijay Gurbani, ALU
Scott Poretsky, Allot Communications
IETF 78 – Maastricht, BMWG

2 Status
Working Group last call in progress.
Needs reviewers and comments.
IIT Masters candidates implemented the methodology using SIPp as the test engine:
–Two systems under test: Asterisk and Kamailio
–Registration with and without authentication
–INVITE (no media) with and without authentication
–INVITE (no media) with and without authentication and with forking
–Report on results in progress
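A minimal sketch, assuming SIPp is installed, of how one step of such a registration-rate run might be driven from Python; the scenario file name, target address, and rate ladder are illustrative assumptions and are not taken from the methodology draft.

import subprocess

TARGET = "192.0.2.10:5060"   # hypothetical DUT address (documentation range)
SCENARIO = "register.xml"    # hypothetical SIPp scenario that sends REGISTER requests

def run_registration_test(rate: int, total: int) -> int:
    """Offer `total` REGISTERs at `rate` attempts/second and return SIPp's exit code."""
    cmd = [
        "sipp", TARGET,
        "-sf", SCENARIO,   # load a custom scenario file (REGISTER instead of INVITE)
        "-r", str(rate),   # attempt rate, in new calls (here registrations) per second
        "-m", str(total),  # stop after this many attempts
        "-trace_stat",     # write periodic statistics to a CSV file for later analysis
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    # Step the offered rate upward until SIPp exits non-zero (treated here as "some
    # attempts failed"); this only approximates the iterative search for a maximum
    # rate and does not reproduce the exact procedure of the methodology draft.
    for rate in (50, 100, 200, 400):
        if run_registration_test(rate, rate * 60) != 0:
            print(f"failures first observed at {rate} registrations/second")
            break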

3 Key Changes to Terminology v02
Section 3.1.5: Removed the dependency on Overload Working Group activities.
Section 3.1.11: Renamed the metric "Standing Sessions" to "Standing Sessions Count" to better reflect that the metric represents a number of sessions.
Section 3.3.4: Added a new definition, Session Duration, to align with the Methodology draft.
Section 3.4.2: Changed the phrase "maximum average rate" to "average maximum rate", the latter being the intended definition.
Additional formatting and editorial changes were also made and referenced on the mailing list.

4 TBD
Will remove Section 3.4.4, Session Overload Capacity, since we have defined overload to be outside the scope of the document. We define and record failures but do not consider the causes of the failures.
Will add a definition of SIP Flooding.

5 Key Changes to Methodology v02
Reduced the number of test topologies and changed the labeling of servers, figures, and figure references throughout.
Added a Baseline Performance test.
Added units to the Reporting Format.

6 Next Steps
Reviewers and comments, please!

7 BACKUP

8 Summary of the Contents: Terminology
SIP Benchmarking Terminology provides four sets of definitions:
–Protocol Components – defines the signaling, media, and control planes; sessions with and without associated media; Invite-initiated sessions and Non-Invite-initiated sessions
–Test Components – defines the parts of the test agent
–Test Setup Parameters – defines Session Attempt Rate, Establishment Threshold Time, and other parameters that must be recorded before entering a test cycle
–Benchmarks – defines seven test parameters

9 Benchmarks
3.4.1. Registration Rate – Definition: The maximum number of registrations that can be successfully completed by the DUT/SUT in a given time period.
3.4.2. Session Establishment Rate – Definition: The maximum average rate at which the DUT/SUT can successfully establish sessions.
3.4.3. Session Capacity – Definition: The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until Session Attempt Failure occurs.
3.4.4. Session Overload Capacity – Definition: The maximum number of Established Sessions that can exist simultaneously on the DUT/SUT until it stops responding to Session Attempts.
3.4.5. Session Establishment Performance – Definition: The percentage of Session Attempts that become Established Sessions over the duration of a benchmarking test.
3.4.6. Session Attempt Delay – Definition: The average time, measured at the Emulated Agent, for a Session Attempt to result in an Established Session.
3.4.7. IM Rate – Definition: The maximum number of IM messages completed by the DUT/SUT.
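As a minimal sketch (not taken from the drafts), two of these benchmarks could be computed from per-attempt records collected at the Emulated Agent; the record format below is an assumption made for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AttemptRecord:
    invite_sent: float               # time the INVITE was sent, in seconds
    established_at: Optional[float]  # time the 200 OK was received, or None if it never was

def session_establishment_performance(records: list[AttemptRecord]) -> float:
    """Percentage of Session Attempts that became Established Sessions (cf. 3.4.5)."""
    established = sum(1 for r in records if r.established_at is not None)
    return 100.0 * established / len(records)

def session_attempt_delay(records: list[AttemptRecord]) -> float:
    """Average time for a Session Attempt to become an Established Session (cf. 3.4.6)."""
    delays = [r.established_at - r.invite_sent
              for r in records if r.established_at is not None]
    return sum(delays) / len(delays)

# Example: three attempts, one of which failed.
records = [AttemptRecord(0.00, 0.12), AttemptRecord(1.00, 1.35), AttemptRecord(2.00, None)]
print(session_establishment_performance(records))  # 66.66... percent
print(session_attempt_delay(records))              # about 0.235 seconds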

10 Reporting Format
Test Setup:
SIP Transport Protocol = ____________________
Session Attempt Rate = _____________________
IS Media Attempt Rate = ____________________
Total Sessions Attempted = __________________
Media Streams Per Session = ________________
Associated Media Protocol = _________________
Media Packet Size = ________________________
Media Offered Load = _______________________
Media Session Hold Time = __________________
Establishment Threshold Time = _______________
Loop Detecting Option = _____________________
Forking Option = ___________________________
Number of endpoints request sent to = ________
Type of forking = __________________________
Authentication = ____________________________
Benchmarks for IS:
Session Capacity = __________________________
Session Overload Capacity = __________________
Session Establishment Rate = _________________
Session Establishment Performance = __________
Session Attempt Delay = _____________________
Session Disconnect Delay = __________________
Benchmarks for NS:
IM Rate = _______________________________
Registration Rate = _________________________
Re-registration Rate = ________________________
Units are added in v02.
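As a purely illustrative sketch (not from the drafts), a filled-in report could be captured as simple key/value pairs; the field names follow the slide above, while the sample values and the units attached to them are assumptions of this example rather than quotations from the v02 text.

test_setup = {
    "SIP Transport Protocol": "UDP",
    "Session Attempt Rate": "100 sessions per second",
    "Total Sessions Attempted": "10000 sessions",
    "Media Streams Per Session": "1",
    "Associated Media Protocol": "RTP",
    "Media Packet Size": "160 bytes",
    "Media Session Hold Time": "30 seconds",
    "Establishment Threshold Time": "5 seconds",
    "Authentication": "enabled",
}
is_benchmarks = {
    "Session Establishment Rate": "250 sessions per second",
    "Session Establishment Performance": "99.8 percent",
    "Session Attempt Delay": "12 milliseconds",
}

# Print the report in the fill-in-the-blank style of the slide above.
for section, fields in (("Test Setup", test_setup), ("Benchmarks for IS", is_benchmarks)):
    print(section)
    for name, value in fields.items():
        print(f"  {name} = {value}")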

11 Scope – DUT/SUT
The DUT must be RFC 3261-capable network equipment. This is referred to as the "Signaling Server".
–This may be a Registrar, Redirect Server, Stateless Proxy, or Stateful Proxy. A DUT MAY also include B2BUA, SBC, or P-CSCF functionality.
–The DUT MAY be a multi-port SIP-to-switched-network gateway implemented as a SIP UAC or UAS.
–The DUT or SUT MUST NOT be end-user equipment.
The DUT MAY have an internal SIP ALG, Firewall, and/or NAT. This is referred to as the "SIP Aware Stateful Firewall".
The Tester acts as multiple "Emulated Agents" that initiate (or respond to) SIP messages as session endpoints and source (or receive) "Associated Media" for established connections.
The Terminology draft defines SIP Control Plane performance benchmarks for black-box measurements of the SIP signaling of networking devices.
–Stress and debug scenarios are not addressed in this work item.
(Figure: Signaling SUT)

12 Scope – Signaling and Media
Control signaling is benchmarked.
Media performance is not benchmarked in this work item.
It is RECOMMENDED that control plane benchmarks be performed with media present, but this is optional.
The SIP INVITE requests MUST always include an SDP body.
The type of DUT dictates whether the associated media streams traverse the DUT or SUT. Both scenarios are within the scope of this work item.
(Figures: two topologies showing Calling UE (Tester) and Called UE (Tester) connected through the DUT or SUT, with Signaling and Associated Media paths; in one the Associated Media traverses the DUT or SUT, in the other it does not.)
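As a minimal illustration of an INVITE carrying an SDP body, sketched in Python; the addresses, tags, Call-ID, and codec choice are assumptions of this example, not values prescribed by the drafts.

# Build a minimal SDP body and an INVITE that carries it.
sdp_body = "\r\n".join([
    "v=0",
    "o=tester 2890844526 2890844526 IN IP4 192.0.2.20",
    "s=-",
    "c=IN IP4 192.0.2.20",
    "t=0 0",
    "m=audio 49170 RTP/AVP 0",
    "a=rtpmap:0 PCMU/8000",
]) + "\r\n"

invite = "\r\n".join([
    "INVITE sip:callee@example.com SIP/2.0",
    "Via: SIP/2.0/UDP 192.0.2.20:5060;branch=z9hG4bKbench01",
    "Max-Forwards: 70",
    "From: <sip:caller@example.com>;tag=1928301774",
    "To: <sip:callee@example.com>",
    "Call-ID: bench-0001@192.0.2.20",
    "CSeq: 1 INVITE",
    "Contact: <sip:caller@192.0.2.20:5060>",
    "Content-Type: application/sdp",
    f"Content-Length: {len(sdp_body)}",   # body length in bytes (ASCII, so len() suffices)
    "",                                   # blank line separating headers from the body
    sdp_body,
])
print(invite)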

13 Session Terms
3.1.6. Session Attempt – Definition: A SIP session for which the Emulated Agent has sent the SIP INVITE or SUBSCRIBE NOTIFY and has not yet received a message response from the DUT/SUT.
3.1.7. Established Session – Definition: A SIP session for which the Emulated Agent acting as the UE/UA has received a 200 OK message from the DUT/SUT.
3.1.8. Invite-initiated Session (IS) – Definition: A session that is created by an exchange of messages in the Signaling Plane, the first of which is a SIP INVITE request.
3.1.9. Non-INVITE-initiated Session (NS) – Definition: A session that is created by an exchange of messages in the Signaling Plane that does not include an initial SIP INVITE message.
3.1.10. Session Attempt Failure – Definition: A session attempt that does not result in an Established Session.
3.1.11. Standing Sessions – Definition: A SIP session that is currently an Established Session.
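A small sketch, not taken from the drafts, of how a tester's log for one session might be labeled using the terms above; the input format and the treatment of provisional responses are assumptions of this example.

from typing import List

def classify_session(sent_initial_request: bool, responses: List[int],
                     threshold_expired: bool) -> str:
    """Label one session as seen by the Emulated Agent, per the terms above."""
    if not sent_initial_request:
        return "no session attempt"
    if 200 in responses:
        return "Established Session"        # 3.1.7: a 200 OK was received from the DUT/SUT
    if not responses:
        return "Session Attempt"            # 3.1.6: request sent, no response received yet
    if threshold_expired or any(code >= 300 for code in responses):
        return "Session Attempt Failure"    # 3.1.10: the attempt did not become established
    return "Session Attempt (in progress)"  # provisional responses only; assumed interpretation

print(classify_session(True, [100, 180, 200], False))  # Established Session
print(classify_session(True, [100, 486], False))       # Session Attempt Failure
print(classify_session(True, [], False))               # Session Attempt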

14 Scope – Scenarios
Session Establishment performance is benchmarked.
Both INVITE and non-INVITE scenarios (such as IM) are addressed.
Different transport mechanisms -- such as UDP, TCP, SCTP, or TLS -- may be used.
–The transport mechanism MUST be noted as a condition of the test, as the performance of SIP devices may vary accordingly.
Looping and forking options are also considered.
–These impact processing at SIP proxies.
REGISTER and INVITE requests may be challenged or remain unchallenged for authentication purposes, as this may impact the performance benchmarks.
–Any observable performance degradation due to authentication is considered to be of interest to the SIP community.

15 Scope – Overload
SIP Overload is considered within the scope of this work item.
–Considerations on how to handle overload are deferred to work progressing in the SIPPING working group.
The normal response to an overload stimulus -- sending a 503 response -- is considered inadequate.
–Vendors are free to implement their specific overload control behavior as the expected test outcome if it is different from the IETF recommendations. However, such behavior MUST be documented and interpreted appropriately across multiple vendor implementations. This will make it more meaningful to compare the performance of different SIP overload implementations.
–This draft now has a dependency on the strategy of the overload work in SIPPING.

16 Out of Scope Scenarios
SIP Control performance benchmarking is the focus of this work item.
–Media performance is not benchmarked in this work item.
–Stress and Steady-State benchmarking is not considered in scope. This could be covered in an Appendix if preferred.
Re-INVITE requests are not considered in scope.
Benchmarking SIP Presence is not considered in scope.
IMS-specific scenarios are not considered, but test cases can be applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT.
Session disconnect is not considered in scope.
–Only session establishment is considered for the performance benchmarks.
–Disconnect is a lightweight transaction that releases resources for steady state.
–It has no performance benchmark because it is dependent on INVITE.
–Posted on SIPPING for feedback.

