
1/39 De-Mystifying Digital Media by Daniel McEnnis

2/39 Overview
Analog Compared to Digital
Metadata
Online Buying
Music Information Retrieval
– Digital Signal Processing
– Machine Learning
Recommendation
– Data Used
– Technical Methods
Fingerprinting
Social Issues

3/39 Analog Recordings

4/39 Speakers

5/39 Digitizing Audio
Place a grid over the signal.
Where the vertical lines cross the signal are the sample points; the only valid values are at the crosshairs.
For example: the first dot crosses at 3.8, so it is assigned 4. The second is assigned 0, even though the value is 0.4 and the signal before it is descending from 3.0.
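The snapping-to-grid step the slide describes is quantization. A minimal sketch, assuming a grid step of 1 and round-half-up behavior (real converters differ in both):

```python
import math

def digitize(signal, step=1.0):
    """Snap each analog value to the nearest grid level.
    floor(x/step + 0.5) rounds to the nearest level,
    rounding exact halves upward."""
    return [math.floor(x / step + 0.5) * step for x in signal]

# The slide's examples: 3.8 snaps to 4, and 0.4 snaps to 0.
digitize([3.8, 0.4])  # [4.0, 0.0]
```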

6/39 Digital Audio

7/39 Terms
Frequency: number of complete cycles per second
Sample rate: number of samples in a given second
Nyquist frequency: the highest frequency a digital signal can represent
Aliasing: bogus frequencies created by the digitization process
Bit depth: how many different potential values an audio signal can have at any given point
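Two of these terms reduce to simple arithmetic. A quick sketch, using CD audio's familiar 44,100 Hz / 16-bit figures as the example:

```python
def nyquist_frequency(sample_rate_hz):
    # The highest frequency a digital signal can represent
    # is half its sample rate.
    return sample_rate_hz / 2

def quantization_levels(bit_depth):
    # Bit depth fixes how many distinct values a sample can take.
    return 2 ** bit_depth

# CD audio: 44,100 samples per second at 16 bits per sample.
nyquist_frequency(44100)  # 22050.0
quantization_levels(16)   # 65536
```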

8/39 Nyquist Frequency

9/39 Aliasing

10/39 Analog Movies
Numerous nearly identical pictures.
Sound is recorded along the edges.

11/39 Digital Movies
RGB color for every dot
Resolution of the video: number of dots
Sample rate of the video: how quickly the frames are presented

12/39 Problems with Digitization
Is the sample rate high enough to capture all the frequencies needed?
Has the signal been filtered to prevent aliasing?
Does the sample rate match the analysis routines to follow?
Is the sample rate small enough to keep the amount of data manageable?

13/39 Metadata

14/39 CD Lookups

15/39 CD Lookup Problems
International releases in different languages
Special releases and extra tracks creating extra entries for a single album
Data entry is error-prone and difficult to scale
How does one enter multi-disc albums or multi-composer tracks?

16/39 CD Submission
Gracenote CDDB
User Submissions
Label Uploads
Editorial Content
Calculated Metadata

17/39 Video Object Detection
Segment a movie to describe it automatically:
Where are the scene changes? Shot changes?
What objects are in the picture? Who is in the picture?
What is the script of the movie?
– Mining closed captioning

18/39 Edge Detection

19/39 Music Information Retrieval
When the given metadata just isn't enough.
Teaching a computer what the properties of music are, usually by example.
Utilizes both Digital Signal Processing (DSP) and machine learning.

20/39 What is DSP?
Analyzing the sequence of numbers that make up a signal for some kind of information or patterns.
Example: Zero Crossings
How many times does the signal pass through zero in a given time frame?
What is it used for? Very good at detecting noisy signals like heavy metal.
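The zero-crossing count used as the DSP example above fits in a few lines. A minimal sketch; treating a sample of exactly zero as non-negative is an assumption, and real feature extractors normalize by frame length:

```python
def zero_crossings(samples):
    """Count how many times consecutive samples change sign."""
    return sum(
        (a >= 0) != (b >= 0)
        for a, b in zip(samples, samples[1:])
    )

# A noisy signal crosses zero far more often than a smooth one.
zero_crossings([1, -1, 1, -1])  # 3
zero_crossings([1, 2, 3, 2])    # 0
```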

21/39 Zero Crossings

22/39 What is Machine Learning?
Create a model of a problem, then use this model to solve problems.
Example: KNN
Find the nearest k (say 3) examples and check what type they are. The class with the most points nearest to the test point is assigned to that point.
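The KNN rule above can be sketched as follows; the 2-D points, genre labels, and squared Euclidean distance are illustrative assumptions:

```python
from collections import Counter

def knn_classify(examples, point, k=3):
    """examples: list of ((x, y), label) pairs.
    Return the majority label among the k nearest examples."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    nearest = sorted(examples, key=lambda e: sq_dist(e[0], point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

examples = [((0, 0), "jazz"), ((0, 1), "jazz"), ((1, 0), "jazz"),
            ((5, 5), "metal"), ((5, 6), "metal")]
knn_classify(examples, (0.5, 0.5), k=3)  # "jazz"
```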

23/39 KNN in action

24/39 Deciding Mood of Music
Clients are asked to run an analysis program over a piece of music.
Gracenote collects the raw features from the client.
Gracenote calculates (using machine learning) the mood of the music.
The mood is stored in the database of metadata.

25/39 Organizing Media
Sort by metadata
– Find all new songs of style Pop with a Rowdy mood.
Automated playlist software
– Apple Genius or Gracenote Discover

26/39 Buying of Music and Movies
Purchases of online digital music downloads increased by 29 percent since last year, accounting for 33 percent of all music tracks purchased in the U.S., according to market research firm The NPD Group. – ZDNet, 2009, "CD sales drop, digital downloads on the"
Purchases of movies totaled $12.3 billion in the first three quarters of 2011, of which 18% were Internet purchases. – Digital Entertainment Group, 2011, Third Quarter 2011 Home Entertainment Report

27/39 Buying Online By Metadata
iTunes
– Digital Rights Management (DRM)
– Download music
– Search by metadata

28/39 Buying By Recommendation
Netflix
– Flat fee
– Unlimited movies
– Search by metadata
– Recommendations
– Streaming of movies only; no downloads

29/39 Recommendation
Huge catalogs, but nothing to watch or listen to: the bugbear of online shopping.
A very hard problem:
– (Needle in a haystack) Pick the best few items out of countless millions of choices.
– (Serendipity) Pick gems that are less well known for true discovery.

30/39 Data for Recommendation
Listened to or watched (LastFM)
Liked or disliked (LastFM, Pandora)
Levels of engagement (shopping sites)
– Looked at on a web page
– Clicked on a web page
– Added to a wish list or shopping cart
– Purchased the item
– Wrote a review of the item

31/39 Technical Means for Recommendation
Collaborative Filtering
– User to User Collaborative Filtering
– Item to Item Collaborative Filtering
Machine Learning
– Per-Item models
Content-Based Recommendation
– Hand-picked top ten lists

32/39 User to User Collaborative Filtering
Compare a person with what others like them listened to or watched.
– People like you listen to the Beatles.
What does it mean to be like another person?
– Number of common items / total number of items
Find the k (say 5) closest people and recommend anything they listen to that the target person doesn't.
Accurate
Usually too time-consuming to calculate
Hard to use all data available
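The steps above can be sketched as follows. Reading "number of common items / number of items" as Jaccard similarity is an assumption, and the user names and catalogs are made up:

```python
def similarity(a, b):
    # Common items divided by total distinct items (Jaccard).
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, users, k=5):
    """users: dict of name -> set of items consumed. Recommend
    anything the k nearest users have that the target lacks."""
    mine = users[target]
    nearest = sorted(
        (u for u in users if u != target),
        key=lambda u: similarity(mine, users[u]),
        reverse=True,
    )[:k]
    recs = set()
    for u in nearest:
        recs |= users[u] - mine
    return recs

users = {"ann": {"Beatles", "Stones"},
         "bob": {"Beatles", "Stones", "Kinks"},
         "cat": {"Opera"}}
recommend("ann", users, k=1)  # {"Kinks"}
```

The cost problem the slide mentions is visible here: comparing the target against every other user makes each recommendation linear in the user base, which is why large catalogs precompute item-to-item similarities instead.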

33/39 Item to Item Collaborative Filtering
People who liked Harry Potter liked The Lord of the Rings.
Algorithm:
– (Beforehand) For every item, create a list of numbers describing how similar this item is to every other item.
– Make a list of all items a person has or likes.
– For every potential recommendation, add up the similarity scores for every item the person listens to.
– Recommend all items that are not already liked and exceed some threshold.
Relatively fast under most circumstances
Less accurate than other methods
Tends to be a compact model
Hard to use all the data available
(The slide's example similarity table covers Harry Potter, The Deep, Hobbit, and Matrix; each item is 1.0 similar to itself, but the off-diagonal values were lost in transcription.)
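The algorithm's four steps can be sketched as follows; the similarity scores and threshold are made-up illustrative values, not the slide's:

```python
def recommend(liked, similarity, threshold=0.5):
    """liked: set of items the user already has.
    similarity: dict of (item, other_item) -> precomputed score.
    Sum each candidate's similarity to everything liked and
    keep candidates whose total clears the threshold."""
    candidates = {b for _, b in similarity} - liked
    return {
        c for c in candidates
        if sum(similarity.get((l, c), 0.0) for l in liked) >= threshold
    }

sim = {("Harry Potter", "Lord of the Rings"): 0.9,
       ("Harry Potter", "Matrix"): 0.2}
recommend({"Harry Potter"}, sim)  # {"Lord of the Rings"}
```

Because the similarity table is computed beforehand, serving a recommendation only touches the rows for the user's own items, which is the "relatively fast, compact model" trade-off the slide lists.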

34/39 Machine Learning
For every item, build a model to predict whether or not a user will want that item.
For KNN, classify every user as having this item or not by whether the user is a neighbor of enough people that like the item.
Very expensive
Most accurate of the prediction techniques
Easy to incorporate extra data

35/39 Problems with Recommendation
Popularity bias
– Everybody likes the Beatles.
Cold start problem
– If I don't know who you are, I can't recommend for you.
– If I don't know much about an item, I can't recommend it.
Speed
– Databases are massive; can the algorithm finish in time?
Memory
– Can we even run the algorithm, or does it take more memory than we have?
Quality
– The problem is very hard. Are the results good enough?
Hard to measure
– Just what does it mean to have a good recommendation?

36/39 Fingerprinting
Choose a set of DSP features.
Create a key describing the features of each movie or song in the database.
Calculate the key of a song or movie to be identified.
Compare keys to see if any of the keys in the database match the unknown song.
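The four steps above can be sketched with a toy key function; rounding feature values to a coarse precision is a stand-in for the far more robust hashing real fingerprinting systems use:

```python
def make_key(features, precision=1):
    # Reduce a feature vector to a coarse, comparable key,
    # so slightly different recordings land on the same key.
    return tuple(round(f, precision) for f in features)

def identify(unknown_features, database):
    """database: dict of key -> title, built beforehand
    from the DSP features of every known song."""
    return database.get(make_key(unknown_features))

db = {make_key([0.31, 2.04]): "Song A"}
identify([0.308, 2.041], db)  # "Song A": keys match despite noise
identify([9.9, 9.9], db)      # None: nothing in the database matches
```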

37/39 Scan and Match
Use fingerprinting to determine what song or movie a customer owns.
– Cover song versus original
– Explicit versus edited versions
– The song is sped up or slowed down (a music-sharing trick)
– Different versions of the same song on different albums
– Live versus recorded performances
Give that customer access to a top-quality copy of that song or movie across all the customer's devices (cell phone, tablet, laptop, computer).
The file itself is never copied, either from the source device (computer) or from the server. Instead the file is streamed (broadcast like a TV signal) to the device on demand.

38/39 Social Issues
How important is owning one's media?
How much control should a media company have over how you use your collection?
– Control how, when, or where you listen to music or watch video
– Revoke access to media you have already paid for
– Ban reselling of media
– Ban reselling of accounts
– Ban copying of music or movies
When should sharing be permitted (if ever)?
– 30-second rule
How much privacy should you expect?

39/39 Illegal File Sharing
Napster
Kazaa
BitTorrent
Basic features:
– A central authority provides info about the file to be distributed.
– Each file is split into many pieces.
– Every person is both server and client, downloading and uploading pieces.
