Online Music Store. MSE Project Presentation III


1 Online Music Store. MSE Project Presentation III
Online Music Store
MSE Project Presentation III
Presented by: Reshma Sawant
Major Professor: Dr. Daniel Andresen
/11/08

2 Phase III Presentation Outline
Project Overview
Brief Review of Phases
Action Items from Phase II
Implementation/Demo
Assessment Evaluation
Project Evaluation
Lessons Learned

3 Project Overview
The objective of this project is to design and develop an Online Music Store.
Target: public users
Product: media for music
User types: User, Administrator
Functionalities for Users: browsing, searching, buying products, getting song recommendations, managing a personal account
Functionalities for the Administrator: manage catalog details, manage orders, manage shopping carts

4 Review of Phases
Phase I: Requirement Specifications
Phase II: Designed Web Pages; Created Test Plan
Phase III (Current): Coding; Testing and Analysis

5 Action Items from Phase II
1) Correct multiplicities in the Class Diagram: the multiplicity between the ShoppingCart class and the CartItem class should be 1..*
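The corrected 1..* multiplicity can be sketched in code. Below is a minimal Python model; the project itself is written in C#/ASP.NET, and the CartItem fields used here (`product_id`, `quantity`) are hypothetical, not taken from the actual class diagram:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CartItem:
    # Illustrative fields only; the real attributes are in the class diagram.
    product_id: int
    quantity: int = 1

@dataclass
class ShoppingCart:
    # Corrected 1..* multiplicity: a ShoppingCart holds one or more CartItems.
    items: List[CartItem] = field(default_factory=list)

    def __post_init__(self):
        # Enforce the lower bound of the association at construction time.
        if not self.items:
            raise ValueError("ShoppingCart requires at least one CartItem (1..*)")

cart = ShoppingCart(items=[CartItem(product_id=101, quantity=2)])
print(len(cart.items))  # 1
```

Enforcing the lower bound in the constructor keeps the implementation consistent with the diagram's multiplicity.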

6 Class Diagram

7 Action Items from Phase II
2) Revise SLOC count and Project Duration: addressed in the Project Evaluation slides

8 Implementation & Demo
Technologies Used:
IDE – Microsoft Visual Studio 2005
Technology – ASP.NET 2.0
Language – C#
Database – SQL Server 2005

9 Assessment Evaluation
Manual Testing – to ensure the correctness of various parts of the code

USER
T-01 – System Register – Passed
T-02 – System Login
T-03 – Add to Cart
T-04 – Edit Shopping Cart
T-05 – Place Order

ADMINISTRATOR
T-06 – Create and Delete Product from a Category
T-07 – Create and Delete Category from a Genre
T-08 – Create and Delete Genre from the Catalog
T-09 – Manage Orders
T-10 – Manage Shopping Carts

10 Assessment Evaluation
E.g. Register Web Page for User (Test Unit: btnSignup)
Test Case: an empty required field (Username, Password, Confirm Password, Security Question, Security Answer)
Result: system prompts the user with the message "All fields are required. Please try again."
Test Case: Username already in use by an existing user
Result: system prompts the user to enter a different username with the message "Please enter a different username."
Test Case: Password and Confirm Password fields do not match
Result: system prompts the user to re-enter the password with the message "The Password and Confirmation Password must match."
Test Case: all required fields entered with valid values
Result: system redirects the user to the secure Login Web page.

E.g. Edit Shopping Cart (Test Unit: btnUpdate, btnDelete)
Test Case: a negative number, or input other than an integer, entered in the "Quantity" field
Result: system prompts the user with the message "Please enter a valid number."
Test Case: a valid positive number entered in the "Quantity" field
Result: system updates the product quantity and displays the message "Your shopping cart was successfully updated" or "Item successfully deleted."
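The quantity check from the Edit Shopping Cart tests above can be sketched as follows. This is an illustrative Python re-implementation, not the project's actual C# code-behind; only the message strings come from the test table:

```python
def validate_quantity(raw: str) -> str:
    """Return the message the page would display for a given Quantity input.

    Sketch of the btnUpdate validation described in the test table; the
    function name and structure are illustrative, not from the project code.
    """
    try:
        qty = int(raw)
    except ValueError:
        # Input other than an integer (e.g. "abc", "2.5") is rejected.
        return "Please enter a valid number"
    if qty <= 0:
        # Negative numbers (and zero) are rejected as well.
        return "Please enter a valid number"
    return "Your shopping cart was successfully updated"

print(validate_quantity("3"))    # valid positive integer
print(validate_quantity("-2"))   # negative input, rejected
print(validate_quantity("abc"))  # non-integer input, rejected
```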

11 Assessment Evaluation
Performance Testing
Goal: determine the load the system can handle, in terms of concurrent users and requests
Response Time – the time from a request being initiated for a Web page until the page is completely displayed in the user's browser
Tool used – JMeter
Inputs to JMeter:
Number of Users
Ramp-up period – time (sec) in which to start the full number of users chosen
Loop Count – how many times to repeat the test
E.g. Users = 10, Loop Count = 20, Ramp-up period = 5 sec => 10 users are started within 5 sec, with total requests = 200 (10 * 20)
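The slide's example arithmetic can be written out directly. A small sketch (the function name and structure are mine, not part of JMeter itself):

```python
def jmeter_load(users: int, loop_count: int, ramp_up_sec: float):
    """Derive total requests and user spawn rate from the JMeter inputs above."""
    total_requests = users * loop_count  # each user repeats the test loop_count times
    spawn_rate = users / ramp_up_sec     # users started per second during ramp-up
    return total_requests, spawn_rate

# The example from the slide: 10 users, loop count 20, 5 sec ramp-up.
total, rate = jmeter_load(users=10, loop_count=20, ramp_up_sec=5)
print(total)  # 200 total requests
print(rate)   # 2.0 users started per second
```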

12 Assessment Evaluation Performance Testing Factors
Load Type
Peak Load – maximum number of users and requests loaded in a short duration (e.g. 5 sec)
Sustained Load – maximum users and requests loaded over a longer period (e.g. 5 min)
Connection
Wireless connection at 54.0 Mbps
LAN connection at 100 Mbps
Web Pages Tested
HTML page (Login Web Page)
Database-intensive page (Home Page)
Business-logic page (Shopping Cart Page)

13 Assessment Evaluation Performance Testing Environmental Set-up
Machine Configuration
Operating System – Windows XP Professional
Memory – 1 GB RAM
Hard Disk – 100 GB
Processor – Intel Pentium M, 1.7 GHz

14 Assessment Evaluation
Home Page: Peak Load over Wireless (54 Mbps) vs. LAN (100 Mbps)
Note
Loop Count constant at 20,000
Ramp-up period of 5 sec
Users – 200, 600, 800, 1000
Observations
Response Time increases roughly linearly with the number of users, for both Wireless and LAN
Maximum number of users handled by the system before it becomes saturated = 1000
Response Time is lower over LAN due to its higher bandwidth

Users   Loop Count   Ramp-up (sec)   Avg. RT Wireless (ms)   Avg. RT LAN (ms)
200     20,000       5               8354                    7400
600     20,000       5               22538                   21700
800     20,000       5               29567                   28600
1000    20,000       5               38603                   35390
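One way to quantify the Wireless vs. LAN gap from the Home Page table above. The averaging method here (mean of per-row ratios) is an assumption; the summary slide later reports about 6.6% for the Home Page, so the slides may average differently:

```python
# (users, avg RT wireless ms, avg RT LAN ms) rows from the Home Page table.
rows = [(200, 8354, 7400), (600, 22538, 21700), (800, 29567, 28600), (1000, 38603, 35390)]

# Percentage by which Wireless exceeds LAN, per row, then averaged.
overheads = [(w - l) / l * 100 for _, w, l in rows]
avg_overhead = sum(overheads) / len(overheads)
print(round(avg_overhead, 1))  # 7.3
```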

15 Assessment Evaluation
Home Page: Constant Users vs. Constant Loop Count (Wireless connection)
Run 1 – Users held constant, Loop Count increased
Run 2 – Loop Count constant at 20,000, Users – 200, 600, 800, 1000

16 Assessment Evaluation
Home Page: Observations
Response Time increases rapidly with the number of users, but only slightly when the users are held constant and only the loop count is increased.
Reason: with the number of users constant, increasing the loop count does not change the requests/sec arriving at the server, so each request sees a similar response time. Increasing the users with the loop count constant raises the requests/sec the server must handle concurrently, hence the longer response times.
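The reasoning above reduces to simple arithmetic: the arrival rate at the server depends on the number of concurrent users, not on the loop count. A sketch with illustrative timing assumptions (the per-iteration time is invented for the example, not measured in the project):

```python
def arrival_rate(users: int, seconds_per_iteration: float) -> float:
    # Each simulated user issues one request every `seconds_per_iteration`,
    # so the steady-state arrival rate is users / seconds_per_iteration.
    return users / seconds_per_iteration

base = arrival_rate(users=200, seconds_per_iteration=2.0)        # 100 req/s
# Loop count does not appear in the formula: more loops only extend the
# test duration, so the rate stays at 100 req/s.
more_loops = arrival_rate(users=200, seconds_per_iteration=2.0)
# More users raises the concurrent load the server must absorb.
more_users = arrival_rate(users=1000, seconds_per_iteration=2.0)  # 500 req/s
print(base, more_loops, more_users)
```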

17 Assessment Evaluation
Comparison of Response Times of all 3 Web pages over the Wireless connection (54.0 Mbps)
Note
Loop Count constant at 20,000
Ramp-up period of 5 sec
Users – 200, 600, 800, 1000
Observations
Response Time increases most for the Home Page, compared with the Login and Shopping Cart pages
Lowest Response Time for the Login Page, since no database requests are submitted by the user
Moderate Response Time for the Shopping Cart Page, because it performs more computation
Response Time for the Shopping Cart Page is on average approx. 28% higher than for the Login Page
Response Time for the Home Page is on average approx. 246% higher than for the Login Page

Users   Login (ms)   Shopping Cart (ms)   Home Page (ms)
200     1900         2500                 8354
600     7439         7700                 22538
800     8500         10800                29567
1000    13000        15400                38603
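The Home Page figure can be reproduced from the table above by averaging the per-row ratios. The averaging method is an assumption on my part, but it matches the slide's "approx. 246%" for the Home Page:

```python
# (login_ms, home_ms) pairs for 200/600/800/1000 users, from the table above.
rows = [(1900, 8354), (7439, 22538), (8500, 29567), (13000, 38603)]

# Mean per-row overhead of the Home Page relative to the Login Page.
home_pct = sum((h - l) / l for l, h in rows) / len(rows) * 100
print(round(home_pct))  # 247, close to the slide's "approx. 246%"
```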

18 Assessment Evaluation
Home Page: External Factors affecting Response Time
Varying network bandwidth
Limited system hardware resources (CPU, RAM, disks) and their configuration
The JMeter tests and the server running on the same machine

19 Assessment Evaluation
Summary
For Peak Load (Users – 200, 600, 800, 1000; Loop Count constant at 20,000; Ramp-up period = 5 sec):
Response Time increases rapidly with the number of users, but only slightly when the users are held constant and only the loop count is increased
Response Time is highest for the Home Page, intermediate for the Shopping Cart Page, and lowest for the Login Page
Wireless vs. LAN:
Login Page – Wireless takes on average 9.5% more Response Time than LAN
Shopping Cart Page – Wireless takes on average 6.8% more Response Time than LAN
Home Page – Wireless takes on average 6.6% more Response Time than LAN

20 Assessment Evaluation
Login Page: Sustained Load over the Wireless connection

Users   Loop Count   Ramp-up (sec)   Avg. Response Time (ms)
800     16,000       300             10335

21 Project Evaluation
Project Duration (actual):
Phase I = 86 hours
Phase II = hours
Phase III = hours
Total = 531 hours
Project Duration (in months):
Estimated at the end of Phase II = 6.5 months
Actual = 7.5 months

22 Project Evaluation
Category Breakdown:
Research = 38.5 hours
Design = 37 hours
Coding = hours
Testing = 32 hours
Documentation = 118 hours
Total = 531 hours

23 Project Evaluation
SLOC Count (actual) – LocMetrics Tool
C# code (including auto-generated C# code) = 2757
SQL code = 540
XML code = 86
CSS code = 412
Total = 3795
SLOC Count (estimated at the end of Phase II) – based on the prototype design in Phase I
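A quick cross-check of the per-language counts against the reported total:

```python
# Actual SLOC counts reported on the slide above.
sloc = {"C#": 2757, "SQL": 540, "XML": 86, "CSS": 412}
total = sum(sloc.values())
print(total)  # 3795, matching the reported total
```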

24 Project Experience
Lessons Learned:
New technology
Use of various tools for design and testing – Visual Studio 2005, JMeter, LocMetrics
Working with UML and class diagrams
The entire life cycle of a project – requirement gathering, design, coding, testing, and documentation
Testing applications at different levels

