Lecture 13: Java Media Framework II - Processing and Controlling Media with JMF


1 Lecture 13: Java Media Framework II - Processing and Controlling Media with JMF

2 JMF Timing Model JMF employs a layered approach to its representation of time. At the low-level end of the time model are classes for representing time to nanosecond accuracy. At the high-level end of the model, JMF sees controllers as being in one of a number of states that are transitioned between under program control.

3 Low-Level Time The Time class specifies a single instant in time. The Time class has the following two constructors:
Time(double seconds) // Time in seconds
Time(long nanoseconds) // Time in nanoseconds
While Time objects cannot tick, classes that implement the TimeBase interface provide a constantly ticking, unalterable source of time. In particular, the SystemTimeBase class implements the TimeBase interface.

4 The Clock interface A Clock contains a TimeBase that provides a source of time. The interfaces Controller, Player, and Processor are all subinterfaces of Clock. The time a Clock object keeps is known as the media time.

5 Media Time The time a Clock object keeps is known as the media time, which is determined as follows:
mediaTime = mediaStartTime + rate * (timeBaseTime - timeBaseStartTime)
mediaTime : The media's own position in time. For instance, if an audio clip is one minute in length, its media time ranges between 0 and 60 seconds.
mediaStartTime : The offset within the media from which play is started.
rate : The rate of time passage for the media. A rate of 1 represents normal forward playback, whereas a value of -5 would represent fast rewind.
timeBaseTime : The current time of the TimeBase object that the Clock incorporates.
timeBaseStartTime : The time of the TimeBase object at which the Clock is started and synchronized with the TimeBase. For instance, the TimeBase might read 3.2 seconds at the moment the Clock is started.
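As a quick sanity check of this formula, here is a minimal plain-Java sketch (no JMF required; the class and method names are illustrative, not part of the JMF API):

```java
// Evaluates the media-time formula from the slide directly.
public class MediaTimeDemo {
    // mediaTime = mediaStartTime + rate * (timeBaseTime - timeBaseStartTime)
    static double mediaTime(double mediaStartTime, double rate,
                            double timeBaseTime, double timeBaseStartTime) {
        return mediaStartTime + rate * (timeBaseTime - timeBaseStartTime);
    }

    public static void main(String[] args) {
        // Normal playback (rate 1): 10 TimeBase seconds after syncStart(),
        // media time has advanced 10 seconds from a start offset of 0.
        System.out.println(mediaTime(0.0, 1.0, 13.0, 3.0));  // 10.0
        // Fast rewind (rate -5) from 60 seconds into the clip:
        // 2 TimeBase seconds later, media time has dropped to 50 seconds.
        System.out.println(mediaTime(60.0, -5.0, 5.0, 3.0)); // 50.0
    }
}
```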

6 The Clock interface The Clock interface can be used to do things such as:
- Start or stop the media at arbitrary locations.
- Control its rate (for example, fast forward or rewind).
A Clock is in one of two possible states: Started or Stopped. The usual steps in starting a clock are:
1. Stop the clock if it is currently started.
2. Set the media (start) time of the clock.
3. Set the rate of the clock.
4. syncStart() the clock.
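The four start-up steps above can be modeled with a toy clock. This is a sketch, not the JMF API: the method names merely mirror Clock's, and the time base is passed in as a plain number.

```java
// A toy model of the Clock start protocol (illustrative stand-in for
// javax.media.Clock; not runnable against real media).
public class ToyClock {
    enum State { STOPPED, STARTED }
    State state = State.STOPPED;
    double mediaStartTime;     // set while stopped, like setMediaTime()
    double rate = 1.0;         // set while stopped, like setRate()
    double timeBaseStartTime;  // captured at syncStart()

    void stop()                  { state = State.STOPPED; }
    void setMediaTime(double t)  { mediaStartTime = t; }
    void setRate(double r)       { rate = r; }
    void syncStart(double tbNow) { timeBaseStartTime = tbNow; state = State.STARTED; }

    // Media time derived from the time base, per the slide's formula.
    double mediaTime(double tbNow) {
        return mediaStartTime + rate * (tbNow - timeBaseStartTime);
    }

    public static void main(String[] args) {
        ToyClock c = new ToyClock();
        c.stop();               // 1. stop if currently started
        c.setMediaTime(30.0);   // 2. set the media (start) time
        c.setRate(1.0);         // 3. set the rate
        c.syncStart(5.0);       // 4. syncStart() against the time base
        // 4 time-base seconds later, media time is 34 seconds.
        System.out.println(c.mediaTime(9.0)); // 34.0
    }
}
```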

7 High-Level Time The Controller interface extends the Clock interface. The Controller interface directly extends Clock in three areas:
- It extends the concept of Stopped into a number of states concerning resource allocation, so that time-consuming processes can be better tracked and controlled.
- It provides an event mechanism by which the states can be tracked.
- It provides a mechanism by which objects providing further control over the Controller can be obtained.

8 States - The Controller interface The Controller interface subdivides the Stopped category of the Clock interface into five states:
Unrealized : A Controller that has been created but hasn't undertaken any resource gathering. At the end of this state, a TransitionEvent is generated.
Realizing : A transition state reflecting the fact that the Controller is gathering information about the resources needed for its task, as well as the resources themselves. At the end of this state, a RealizeCompleteEvent is generated.
Realized : A steady state reflecting a Controller that has gathered all the nonexclusive resources needed for the task. At the end of this state, a TransitionEvent is generated.
Prefetching : A transition state reflecting the fact that the Controller is gathering all resources needed for its task that weren't obtained in the realizing state. Typically this means acquiring exclusive-use resources such as hardware. At the end of this state, a PrefetchCompleteEvent is generated.
Prefetched : The Controller has acquired all necessary resources, performed all pre-start-up processing, and is ready to be started.
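This state sequence and its completion events can be sketched as a toy state machine. Note the hedge: in JMF the states are int constants on javax.media.Controller and realize()/prefetch() are asynchronous; here they complete immediately and simply return the name of the event a listener would receive.

```java
// Toy model of the stopped-side Controller states (illustrative only).
public class ControllerModel {
    enum State { UNREALIZED, REALIZING, REALIZED, PREFETCHING, PREFETCHED }
    State state = State.UNREALIZED;

    // Moves through the Realizing transition state to Realized.
    String realize() {
        state = State.REALIZING;
        state = State.REALIZED;
        return "RealizeCompleteEvent";
    }

    // Moves through the Prefetching transition state to Prefetched.
    String prefetch() {
        state = State.PREFETCHING;
        state = State.PREFETCHED;
        return "PrefetchCompleteEvent";
    }

    public static void main(String[] args) {
        ControllerModel c = new ControllerModel();
        System.out.println(c.realize());  // RealizeCompleteEvent
        System.out.println(c.prefetch()); // PrefetchCompleteEvent
        System.out.println(c.state);      // PREFETCHED
    }
}
```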

9 Methods - The Controller In addition to syncStart() from the Clock interface, the Controller provides the asynchronous methods realize() and prefetch() to bring the Controller to the corresponding states.

10 Player interface While the Player interface extends the Controller interface, it adds more power while retaining the same states as the Controller. Player relaxes some restrictions that a Controller imposes on which methods can be called on a Started Controller. As a convenience, Player provides a start() method that can be invoked before a Player is Prefetched. This method attempts to transition the Player to the Started state from whatever state it's currently in. For example, if the Player is Unrealized, start() implicitly calls realize() and prefetch() (of the Controller interface) and syncStart() (of the Clock interface). The appropriate TransitionEvents are posted as the Player moves through each state on its way to Started.

11 The Processor interface The Processor interface extends the Player interface. It is central to many processing tasks. Processor extends the state transition cycle of a Controller by adding the Configuring and Configured states. A ConfigureCompleteEvent is posted when the Processor reaches the Configured state. The purpose of these additional states is to further refine the realizing process. The realizing step is essentially split into two phases:
- Source information gathering: the input DataSource is queried and the input media stream is parsed to get the format information for the tracks in the stream.
- Construction: the internals of the Processor are constructed to handle the input media stream.
Between these two phases, you can program the Processor to perform specific processing on its media stream. The states of a Processor are: Unrealized, Configuring, Configured, Realizing, Realized, Prefetching, Prefetched, and Started.

12 Methods - Processor interface The Processor has a configure() method to bring it to the Configured state, in addition to the realize(), prefetch(), syncStart(), and start() methods inherited from the Controller, Clock, and Player interfaces. Objects that implement the ControllerListener interface can receive events generated by a Controller object (or a Player or Processor object). Classes that implement the ControllerListener interface must implement the following method:
public synchronized void controllerUpdate(ControllerEvent e)
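The usual shape of controllerUpdate() is a chain of instanceof checks on the concrete event type, as the listener on slide 16 does. A self-contained sketch of that dispatch pattern (the nested event classes here are local stand-ins for javax.media's ControllerEvent hierarchy, and the returned strings are illustrative actions):

```java
// Models the instanceof-dispatch idiom of a ControllerListener without JMF.
public class ListenerSketch {
    static class ControllerEvent {}
    static class RealizeCompleteEvent extends ControllerEvent {}
    static class EndOfMediaEvent extends ControllerEvent {}

    // A real controllerUpdate() branches on the event subtype like this.
    static String dispatch(ControllerEvent e) {
        if (e instanceof RealizeCompleteEvent) return "add GUI components";
        if (e instanceof EndOfMediaEvent)      return "rewind and restart";
        return "ignore";
    }

    public static void main(String[] args) {
        System.out.println(dispatch(new RealizeCompleteEvent()));
        System.out.println(dispatch(new EndOfMediaEvent()));
    }
}
```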

13 Processing Just as we created imaging chains, we can create processing chains using the DataSource class and the Processor and DataSink interfaces. These are central to most processing tasks. The following classes are useful in sourcing the media:
MediaLocator : Specifies the location and protocol for the media. Closely related to the URL class. Needed to create a DataSource.
DataSource : A manager for media transfer. Required to create Players, Processors, or DataSinks. A DataSource can be obtained from a Processor using the getOutput() method, which enables chaining of processing tasks.
The DataSink interface enables us to write the media to a file.

14 Playing media using a Processor Because a Processor works at a lower level, we have to do more than we did with a Player, but in return we gain added control. Before we start a Processor, we need to bring it to the Configured state.

15 Bringing a Processor to the Configured State
// ...
URL url = new URL(getCodeBase(), videoFile);
MediaLocator mediaLocator = new MediaLocator(url);
try {
    processor = Manager.createProcessor(mediaLocator);
} catch (NoProcessorException npe) {
    System.out.println("No Processor Exception");
} catch (IOException ioe) {
    System.out.println("IO error creating player");
}
processor.addControllerListener(new VideoControlListener());
processor.configure();
// ...

16 Playing media using a Processor
public class VideoControlListener implements ControllerListener {
    private Processor processor;

    public void controllerUpdate(ControllerEvent event) {
        processor = (Processor) event.getSourceController();
        if (event instanceof ConfigureCompleteEvent) {
            // Refer next slide
        } else if (event instanceof RealizeCompleteEvent) {
            SwingUtilities.invokeLater(new AddComponentsThread());
        } else if (event instanceof EndOfMediaEvent) {
            processor.setMediaTime(new Time(0));
            processor.start();
        }
    }
}

17 Processing - ConfigureCompleteEvent
processor.setContentDescriptor(null);
TrackControl[] controls = processor.getTrackControls();
for (int i = 0; i < controls.length; i++) {
    if (controls[i].getFormat() instanceof VideoFormat)
        controls[i].setFormat(new VideoFormat(VideoFormat.CINEPAK));
    else
        controls[i].setFormat(new AudioFormat(AudioFormat.LINEAR));
}
processor.start();

18 class AddComponentsThread
class AddComponentsThread implements Runnable {
    private Component controlPanel, visualComponent;

    public void run() {
        controlPanel = processor.getControlPanelComponent();
        if (controlPanel != null)
            panel.add(controlPanel, BorderLayout.SOUTH);
        else
            System.out.println("Unable to create Control Panel");
        visualComponent = processor.getVisualComponent();
        if (visualComponent != null)
            panel.add(visualComponent, BorderLayout.CENTER);
        else
            System.out.println("Unable to create Visual Component");
        panel.validate();
    }
}

19 Programming the Processor
1. Create a Processor through the Manager class.
2. Bring the Processor to the Configured state using configure().
3. Set the required ContentDescriptor on the Processor, which specifies the content type of the media that will be produced.
4. Obtain the TrackControls of the Processor.
5. For each TrackControl:
   - Set the required output Format.
   - If codecs/effects are to be used for the track, set the TrackControl's codec chain.
   - If the track is to be rendered, set the TrackControl's renderer.
6. Start the Processor.

20 Programming the Processor If the Processor is to render the media to the screen rather than outputting it, the Processor's setContentDescriptor() method should be passed null. A rather restricted form of a realized Processor can be created using the ProcessorModel class. But this method can cause run-time failures in certain situations, in particular when the Processor output is to be cast to a PushBufferDataSource. We will talk about PushBufferDataSource soon.

21 Using the ProcessorModel class
// ...
URL url = new URL(getCodeBase(), videoFile);
MediaLocator mediaLocator = new MediaLocator(url);
Format[] formats = new Format[1];
formats[0] = new VideoFormat(VideoFormat.CINEPAK);
ContentDescriptor outputType = null;
ProcessorModel processorModel = new ProcessorModel(mediaLocator, formats, outputType);
try {
    processor = Manager.createRealizedProcessor(processorModel);
} catch (CannotRealizeException cre) {
    System.out.println("Cannot Realize Processor");
} catch (NoProcessorException npe) {
    System.out.println("No Processor Exception");
} catch (IOException ioe) {
    System.out.println("IO error creating processor");
}
// ...


