Java Media Framework The Java Media Framework (JMF) is an application programming interface (API) for incorporating time-based media into Java applications.


Java Media Framework
The Java Media Framework (JMF) is an application programming interface (API) for incorporating time-based media into Java applications and applets.
- The JMF 1.0 API (the Java Media Player API) enabled programmers to develop Java programs that presented time-based media.
- The JMF 2.0 API extends the framework to support capturing and storing media data, controlling the type of processing performed during playback, and performing custom processing on media data streams.
- JMF 2.0 also defines a plug-in API that allows developers to customize and extend JMF functionality.

JMF Media Processing Model

Media Streams
- Media streams often contain multiple channels, called tracks.
- Example: an MPEG-1 file usually contains two tracks, an audio track and a video track.
- Separating a stream into its individual tracks is demultiplexing; combining tracks into a single stream is multiplexing.

Example
Process an MPEG-1 A/V media stream:
- Transcode the video track to H.263
- Transcode the audio track to GSM
Steps:
- Demultiplex to obtain the tracks
- Decompress the video track, then compress it using H.263
- Decompress the audio track, then compress it using GSM
- Multiplex the two tracks
- Save to a file

JMF Design Goals
- Enable input, processing, and output of time-based media
- Provide a common cross-platform API for accessing underlying media frameworks
- Be extensible: support additional content types and formats, optimize handling of supported formats, and create new presentation mechanisms

Supported Content Types
- Not every supported type can be both decoded and encoded
- There are differences between the platform-independent and platform-dependent versions
- Audio: WAV, GSM, MIDI, etc.
- Image: JPEG, etc.
- Video: H.261, H.263, MPEG-1, QuickTime, AVI, etc.

Recording, processing, and presenting time-based media

High-level Architecture

Some JMF Base Interfaces
- Clock
- Controller
- Control

Time Model
The Clock interface:
- Defines basic timing and synchronization operations
- Contains a TimeBase, based on the system clock
- Time-base time simply provides the current time
- A Clock marks time for a particular media stream; its media time is the current position within that media stream

Time Model

Clock
- Playback rate: how fast the Clock is running in relation to its TimeBase
- Examples: a rate of 1.0 represents normal running time; a rate of 2.0 means the presentation will run at twice the normal rate
- Clock transform: media-time = media-start-time + rate * (time-base-time - time-base-start-time)

Example
Have a 20 sec MPEG video stream:
- MediaStartTime = 10 secs
- TimeBaseTime = 3 secs
- TimeBaseStartTime = 0 secs, so TimeBaseTime - TimeBaseStartTime = 3 secs
- media-time = media-start-time + rate * (time-base-time - time-base-start-time)
- So if rate = 1.0, MediaTime = ??
- Alternatively, if rate = -2.0, MediaTime = ??
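The clock transform can be checked with plain arithmetic. This is a minimal sketch of the formula from the slide; the class and method names are illustrative only and are not part of the JMF API:

```java
public class ClockTransform {
    // media-time = media-start-time + rate * (time-base-time - time-base-start-time)
    static double mediaTime(double mediaStartTime, double rate,
                            double timeBaseTime, double timeBaseStartTime) {
        return mediaStartTime + rate * (timeBaseTime - timeBaseStartTime);
    }

    public static void main(String[] args) {
        // Values from the example: MediaStartTime = 10 s,
        // TimeBaseTime = 3 s, TimeBaseStartTime = 0 s
        System.out.println(mediaTime(10.0, 1.0, 3.0, 0.0));   // rate 1.0  -> 13.0
        System.out.println(mediaTime(10.0, -2.0, 3.0, 0.0));  // rate -2.0 -> 4.0
    }
}
```

So at rate 1.0 the media position is 13 seconds into the stream, while at rate -2.0 (playing backwards at twice normal speed) it is 4 seconds.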

Achieving Synchronization
Example: to force a video renderer to sync to the time base of an audio renderer:
- X = audio_renderer.getTimeBase()
- video_renderer.setTimeBase(X)
Both objects would now use the same source of time.

Controller Interface
- Defines the basic state and control mechanism for an object that controls, presents, or captures time-based media
- Two types of Controller: Players and Processors (considered later)

Controllers

Controller lifecycle

Controller Events
JMF objects can post a MediaEvent. Events posted by a Controller:
- TransitionEvents, posted when a Controller changes state
- Change notification events, e.g. RateChangeEvent
- ControllerClosedEvents, posted when a Controller shuts down
There is a corresponding listener interface for each type of JMF object that can post MediaEvents.

JMF Event Model

Controls
- A mechanism for setting and querying attributes of an object
- Certain objects expose Controls; e.g. this is often used by PlugIns to provide access to their Control objects
- Examples: FrameRateControl, GainControl
- A listener can be associated, e.g. to be notified when the volume changes

Key objects in JMF
- Managers
- DataSources
- Players
- Processors
- DataSinks

General Managers
- Intermediary objects that enable new implementations of key interfaces
- Four main types: Manager, PlugInManager, PackageManager, CaptureDeviceManager

The Manager object
The Manager object is used for instantiating:
- DataSources, used to deliver time-based multimedia data
- Players, used to control and render multimedia data
- Processors, used to process data and output the processed data
- DataSinks, which take a DataSource as input and render the output to a specified destination

The Manager Object

Data Model in JMF
Data sources:
- Media players use DataSources to manage the transfer of media content
- A DataSource encapsulates the location of the media and the protocol used to deliver the media
- Typically identified by a URL or a MediaLocator

Capture
- Capture devices are represented as DataSources, e.g. a microphone, a video capture board, etc.
- Devices can deliver multiple data streams, e.g. audio and video from a camera, or multiple audio tracks from a recording session
- You may then wish a single DataSource to contain multiple SourceStreams: Manager.createMergingDataSource(sourceStreams)

Push and Pull Data Sources
Data sources can be categorized according to how data transfer is initiated:
- Pull data source: the client initiates the data transfer, e.g. HTTP and FILE
- Push data source: the server initiates the data transfer, e.g. broadcast and multicast media

Players
- A Player processes an input stream and renders it at a precise time
- It does not provide any control over the processing that it performs or how it renders the media data

Players
- Player extends the Controller interface: it has a lifecycle and sends media events
- A Player is a MediaHandler, created in any of these ways:
  player = Manager.createPlayer(myDataSource);
  player = Manager.createPlayer(myMediaLocator);
  player = Manager.createPlayer(myUrl);

Players

UI Components
Players provide access to UI components:
- A Player (or Processor) can provide two UI components: a visual component and a control-panel component
- These components can be retrieved with getVisualComponent() and getControlPanelComponent()

Player States Continued
- Players post transitional events as they move between states; a ControllerListener can track them
- Is the Player in an appropriate state? Only certain methods make sense in certain states
- e.g. calling the getTimeBase method on an unrealized Player gives an error

Processors
- Can also be used to present media data
- A Processor is a specialized type of Player that provides control over the processing performed on the input media stream

Processing

Processor Stages

Additional Processor States
Two additional stand-by states:
- Configuring
- Configured: TrackControls can now be used

Processing Controls
- For a given track, the processing operations performed by the Processor can be controlled through that track's TrackControl:
  TrackControl[] tcs = processor.getTrackControls();
- You can explicitly select the Effect, Codec, and Renderer plug-ins to use:
  tcs[1].setCodecChain(arrayOfCodecs);

Configuring the Processor
Consider using a Processor to transcode an (audio + video) QuickTime movie, changing the MPEG video track to H.263:
  p = Manager.createProcessor(dataSource);
  p.configure();
  // once configured:
  p.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.QUICKTIME));
  TrackControl tcs[] = p.getTrackControls(); // returns an array, e.g. 2 track controls

Configuring the Processor Format f0 = new VideoFormat(VideoFormat.h263, new Dimension(width, height), Format.NOT_SPECIFIED, Format.byteArray, (float)frameRate); Format f1 = new AudioFormat(AudioFormat.mpeg, , 2); tcs[0].setFormat(f0) tcs[1].setFormat(f1) p.realize() p.start()

Processor Summary
A Processor does not have to output data as a DataSource; such a Processor (i.e. one that renders the data) is effectively a configurable Player.

Data Storage and Transmission
DataSink:
- Used to read data from a DataSource and render the media to an output destination
- Typical actions: write data to a file, send it across a network, etc.

Using the DataSink
  MediaLocator dest = new MediaLocator("file://newfile.wav");
  dsink = Manager.createDataSink(ds, dest);
  dsink.addDataSinkListener(this);
  dsink.open();
  p.start();
  dsink.start();
Wait for the EndOfStream event, then close the DataSink and remove the listener:
  dsink.close();

Example Applet Movie Player

Simple Java Applet that demonstrates how to create a simple media player with a media event listener. It will play the media clip right away and continuously loop. <!-- Sample HTML -->

Basic Steps
Initialisation:
- Retrieve the applet's FILE parameter
- Use it to locate the media file and build a URL
- Create a Player using the Manager object
- Register the applet as a ControllerListener

Steps 1 & 2: Resolving a URL for the media stream
  // The applet tag should contain the path to the
  // source media file, relative to the html page.
  if ((mediaFile = getParameter("FILE")) == null)
      Fatal("Invalid media file parameter");
  try {
      // Create a url from the file name and the url
      // to the document containing this applet.
      if ((url = new URL(codeBase, mediaFile)) == null)
          Fatal("Can't build URL for " + mediaFile);

Step 3: Using Manager to Create a Player
  // Create an instance of a player for this media
  try {
      player = Manager.createPlayer(url);
  } catch (NoPlayerException e) {
      System.out.println(e);
      Fatal("Could not create player for " + url);
  }

Step 4: Register applet as a ControllerListener
      // Add ourselves as listener for player's events
      player.addControllerListener(this);
  } catch (MalformedURLException e) {
      Fatal("Invalid media file URL!");
  } catch (IOException e) {
      Fatal("IO exception creating player for " + url);

Controlling the Player
Starting the Player:
  public void start() {
      if (player != null)
          player.realize();
  }
Stopping the Player:
  public void stop() {
      if (player != null) {
          player.stop();
          player.deallocate();
      }
  }

Responding to media events
Need to implement ControllerListener.
When the Player is realized, it posts a RealizeCompleteEvent:
- Get the visual component:
  if ((visualComponent = player.getVisualComponent()) != null)
      cPanel.add(visualComponent);
- Get the control panel component
When the media has reached the end, the Player posts an EndOfMediaEvent; rewind and start over:
  player.setMediaTime(new Time(0));

Extensibility
JMF functionality can be extended in two ways:
- Using plug-ins: effectively implementing custom processing components that can be interchanged with the standard components used by a Processor
- By direct implementation: implementing the Controller, Player, Processor, DataSource, or DataSink interfaces directly, e.g. implementing a Player that uses a hardware MPEG decoder, or integrating existing media engines

Interface PlugIn
The base interface for JMF plug-ins. A PlugIn is a media processing unit that accepts data in a particular format and processes or presents the data.
- Registered through the PlugInManager
- Methods: open(), close(), getName(), reset()
- Sub-interfaces: Codec, Effect, Demultiplexer, Multiplexer, etc.

Codecs
- One input and one output
- Methods: setInputFormat(), setOutputFormat(), getSupportedInputFormats(), getSupportedOutputFormats(), process(inputBuffer, outputBuffer)

Example 2 Accessing individual frames

FrameAccess Problem How to access individual decoded video frames from a Processor while processing the media. This could be used for scanning the decoded data; computing statistics for each video frame, etc.

FrameAccess Solution: use a 'pass-through' plug-in codec as a callback when individual frames are being processed. Steps:
1) Build the pass-through codec.
2) Create a Processor from the input file; this Processor will be used as a Player to play back the media.
3) Get the TrackControls from the Processor.
4) Set your codec on the video track: trackControl.setCodecChain(yourCodecs);

Basic code
  // Get the video track's TrackControl
  TrackControl videoTrack = null;
  for (int i = 0; i < tc.length; i++) {
      if (tc[i].getFormat() instanceof VideoFormat) {
          videoTrack = tc[i];
          break;
      }
  }
  // Instantiate & set the frame access codecs on the data flow path.
  Codec codec[] = { new PreAccessCodec(), new PostAccessCodec() };

PreAccessCodec
  public class PreAccessCodec implements Codec {
      void accessFrame(Buffer frame) {
          long t = (long) (frame.getTimeStamp() / f);
          System.err.println("Pre: frame #: " + frame.getSequenceNumber()
                  + ", time: " + ((float) t) / 100f
                  + ", len: " + frame.getLength());
      }
      // ... other methods, e.g. getSupportedInputFormats(), etc.
  }

The Codec’s Process method
  public int process(Buffer in, Buffer out) {
      // This is the "callback" to access individual frames.
      accessFrame(in);

      // Swap the data between the input & output.
      Object data = in.getData();
      in.setData(out.getData());
      out.setData(data);

      // Copy the input attributes to the output.
      out.setFormat(in.getFormat());
      out.setLength(in.getLength());
      out.setOffset(in.getOffset());

      return BUFFER_PROCESSED_OK;
  }
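The swap in process() avoids copying the payload: the output buffer's old array is handed back to the input buffer so it can be reused for the next frame. A minimal sketch of the pattern, using a stand-in Holder class rather than JMF's javax.media.Buffer (class and method names here are illustrative only):

```java
// Stand-in for javax.media.Buffer, just enough to show the swap pattern.
class Holder {
    private Object data;
    Object getData() { return data; }
    void setData(Object d) { data = d; }
}

public class SwapDemo {
    // Move the payload from 'in' to 'out' without copying, recycling
    // out's old array back into 'in' for reuse on the next frame.
    static void swapData(Holder in, Holder out) {
        Object data = in.getData();
        in.setData(out.getData());
        out.setData(data);
    }

    public static void main(String[] args) {
        Holder in = new Holder();
        Holder out = new Holder();
        in.setData(new byte[] {1, 2, 3}); // the decoded frame
        out.setData(new byte[16]);        // a scratch array to recycle
        swapData(in, out);
        System.out.println(((byte[]) out.getData()).length); // prints 3
        System.out.println(((byte[]) in.getData()).length);  // prints 16
    }
}
```

After the swap the output holds the frame data and the input holds the recycled scratch array, which is why the codec is cheap enough to sit in the playback path.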