Video

The HP webOS SDK offers two options for playing video:

  • The webOS Video Player. If you want to let the user play a piece of video content and don't have specialized user interface requirements, you can launch the video player built into webOS. The player launches in its own card and provides basic playback controls. When finished watching the video, the user can toss the card away and return to your application.

  • HTML 5 Video objects. If you want to play video that is integral to the function of your app (e.g., a "cut scene" in a game, or video help content), or if you want to provide an integrated video playback experience with a custom user interface, your app can play video using HTML 5 Video objects.

Regardless of which option you choose, you can play video from a local file on the user's device, or from a remote server. See Supported Video Formats, below, for details on the protocols and formats webOS supports.

On this page:

  • Using the webOS Video Player
  • Using HTML 5 Video Objects
    • How Video Objects Work in webOS
    • Adding a Video Object
    • Loading the MediaExtension Library
    • Setting an Object's Audio Class
    • Controlling a Media Object
      • Playing, Pausing and Seeking
      • Determining Whether a Media Object is Pausable
      • Handling Media Events
    • Setting a Video Object's Fit Mode
  • Supported Video Formats

Using the webOS Video Player

To initiate playback of a video in the webOS Video Player app, you launch the app using the Application Manager service, providing the URL (which may be local or remote) of the file or stream you want to play.

Application Manager provides two different methods for launching another application. The open method takes just a URL and uses the MIME type and/or filename extension to determine which application to launch. You may wish to use this method if you're not sure what type of resource (audio or video, for example) your URL represents and you want webOS to choose the right app for playback:

this.controller.serviceRequest("palm://com.palm.applicationManager",
  {
      method: "open",
      parameters: {
          target: "http://mydomain.com/media/myfile.m4v"
      }
  }
); 

The launch method allows you to specify which application to use. As long as you're sure your URL represents a video resource, this method gives you more control and should result in a faster launch:

this.controller.serviceRequest("palm://com.palm.applicationManager",
  {
      method: "launch",
      parameters: {
          id: "com.palm.app.videoplayer",
          params: {
              target: "http://mydomain.com/media/myfile.m4v"
          }
      }
  }
); 
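
If your app needs to know whether the launch request was accepted, you can also supply onSuccess and onFailure callbacks in the service request. The following is a minimal sketch of this approach; the log messages are illustrative only:

this.controller.serviceRequest("palm://com.palm.applicationManager",
  {
      method: "launch",
      parameters: {
          id: "com.palm.app.videoplayer",
          params: {
              target: "http://mydomain.com/media/myfile.m4v"
          }
      },
      onSuccess: function(response) {
          // The Application Manager accepted the launch request
          Mojo.Log.info("Video player launch requested");
      },
      onFailure: function(response) {
          // The launch request failed; notify the user or fall back
          Mojo.Log.warn("Video player launch failed");
      }
  }
);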

For details on the use of Application Manager, see the Application Manager reference.

Note:

Video and audio playback is not currently supported in the emulator.

Using HTML 5 Video Objects

Important:

webOS 1.4 introduced a change in implementation of the HTML 5 Media API. Developers beginning work on new apps should use the updated API, described on this page. Developers with applications already in the App Catalog should transition to the updated API within the coming months. For more information, see Transitioning to the Updated Media API.

For playing video within an app, the webOS SDK supports the proposed HTML 5 Media specification. To play video, you can create a Video object, call the methods exposed by that object to control playback, and handle the events generated by that object as needed.

This document does not cover use of the HTML 5 Video object in detail, but the HTML 5 Media spec does provide exhaustive detail, and many useful tutorials and walkthroughs may be found online.

How Video Objects Work in webOS

webOS's support for the Video object is currently designed to display video in full-screen mode only. This has several implications that you should understand:

  • Regardless of the layout properties (size and position) you specify for a Video object, the video always plays on a "virtual plane" that is scaled to fit the full size of the display and oriented for viewing with the device rotated to the left (90 degrees counterclockwise from upright).

  • A Video object's layout properties do have an impact, however: they determine the size and position of the viewport in which the video will be visible. This viewport effectively crops the video, exposing only a portion of it for viewing. What appears in the area surrounding the viewport is determined by your scene's layout.

  • In practice, this means you will always want to size your Video objects to use the full display area (unless you are playing a video that has been designed specifically to be cropped).

  • webOS does not currently support the controls attribute for Video objects, which means that if your app requires playback controls, you'll need to provide them yourself.

  • If you do provide playback controls or other UI elements to be shown with the video, you should overlay them above the video. For the best user experience, you should make your controls transparent and/or show and hide them dynamically. You can learn from (and reuse) the source code of the webOS Video Player app, which you'll find installed with the SDK at <sdk-root>/share/refcode/applications.

  • Likewise, if you provide controls, you should set your stage's orientation to "left" when displaying the scene so that your controls are oriented the same as the video itself.
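
As an example, here is a minimal sketch of how a scene assistant (named VideoSceneAssistant here for illustration) might rotate the stage to the left while its video scene is active and restore the upright orientation when the scene is deactivated:

VideoSceneAssistant.prototype.activate = function() {
  // Match the stage orientation to the video's "left" orientation
  this.controller.stageController.setWindowOrientation("left");
}

VideoSceneAssistant.prototype.deactivate = function() {
  // Restore the upright orientation when leaving the video scene
  this.controller.stageController.setWindowOrientation("up");
}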

Adding a Video Object

In contrast to the Audio object, the Video object has no JavaScript constructor. To add a Video object, simply include the <video> element in your HTML markup for the scene in which you want the video to appear, as shown in the example below.

<video src="http://mydomain.com/media/myfile.mp4" id="myVideoElement"
  width="100%" height="100%"></video> 

Note that we have set both the width and height of the Video object to 100%, as discussed above in How Video Objects Work in webOS. We have set these properties directly in the markup, but they could alternatively be set via CSS.

Loading the MediaExtension Library

webOS adheres very closely to the HTML 5 Media spec, but there are some cases (described below) in which it's necessary to augment the specified functionality. For these cases, webOS provides the MediaExtension library.

Currently, cases that require use of the MediaExtension library include:

  • Setting an Audio or Video object's audio class
  • Determining whether an Audio or Video object is pausable
  • Setting a Video object's fit mode

To use the MediaExtension library, you first need to include MojoLoader in your application by adding the following line within the <head> section of your app's index.html file:

<script src="/usr/palm/frameworks/mojoloader.js" type="text/javascript"></script> 

Then, generally within a scene assistant, you load the MediaExtension library and use it to instantiate a MediaExtension object for any Audio or Video object that may require one:

// Load the MediaExtension library
this.libs = MojoLoader.require({ name: "mediaextension", version: "1.0"});

// If you don't already have one, get a reference to the media element, using its ID
this.mediaObj = this.controller.get("myMediaElement");

// Instantiate the MediaExtension object
this.extObj = this.libs.mediaextension.MediaExtension.getInstance(this.mediaObj);

Setting an Object's Audio Class

To ensure a good user experience, webOS automatically manages some aspects of audio and video playback. For example, when a webOS device receives a phone call or triggers an audible alert, it automatically mutes, reduces the volume of, or pauses any other audio or video that may be playing.

In order to do the right thing, webOS needs to know something about the nature of each media object that your application plays. You can provide webOS with the information it needs by setting an object's audio class.

webOS supports a number of audio classes at the system level, but for applications there are only two classes that commonly apply: audio and video content should be assigned the media class, while application sounds (sound effects, UI cues, and so on) should be assigned the defaultapp class.

webOS assumes the defaultapp audio class unless you specify otherwise, so in practice you generally only need to set the audio class for objects that should be classified as media.

An Audio or Video object whose audio class is set to media behaves as follows:

  • An object that is playing will automatically pause when a phone call begins or when another object of the media class begins playing (generally within another app).

  • An Audio object that has been paused for a phone call will automatically resume playing when the call ends, but a Video object will not.

  • Neither an Audio nor a Video object will automatically resume after being paused to accommodate the playback of another piece of media.

Note:

When webOS pauses or resumes playing a particular object, the object will fire a pause or play event, per the HTML 5 Media specification. Your application should listen for these events and update any playback controls your UI may provide (and perform other operations as needed). See Handling Media Events, below.

To set an object's audio class, you first need to load the MediaExtension library and obtain a MediaExtension object, as described above. Once you have obtained a MediaExtension object, it's simple to set the audio class:

// this.extObj is a MediaExtension object associated with
// the Media object whose audio class we want to set
this.extObj.audioClass = "media";

Controlling a Media Object

A detailed discussion of controlling HTML 5 Media objects is beyond the scope of this document, but this section provides a high-level introduction to controlling video objects within a webOS application. For more information, please refer to the HTML 5 Media specification or reference and tutorial resources available from other sources.

Playing, Pausing and Seeking

Per the HTML 5 Media spec, media objects expose play() and pause() methods, and a currentTime property for seeking to a particular point in the media. You can use these (along with the object's other methods and properties) to control playback.

The following example illustrates how you might control media playback in response to button taps in your UI (assuming you are listening for taps on each button and have registered the following methods as handlers):

MySceneAssistant.prototype.handlePlayButtonTap = function() {
  this.mediaObj.play();
}

MySceneAssistant.prototype.handlePauseButtonTap = function() {
  this.mediaObj.pause();
}

MySceneAssistant.prototype.handleRewindButtonTap = function() {
  this.mediaObj.currentTime = 0.0;
} 
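
Because currentTime accepts any position within the media, you can also seek relative to the current position. As a sketch, a hypothetical "skip forward" handler might jump ahead ten seconds without seeking past the end of the media:

MySceneAssistant.prototype.handleSkipForwardButtonTap = function() {
  // Jump ahead 10 seconds, clamping to the media's duration if it is known
  var newTime = this.mediaObj.currentTime + 10;
  if (!isNaN(this.mediaObj.duration)) {
    newTime = Math.min(newTime, this.mediaObj.duration);
  }
  this.mediaObj.currentTime = newTime;
}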

Determining Whether a Media Object is Pausable

Some media streams that use the RTSP protocol are not pausable. You may need to adjust your application's playback UI or logic in this case, so webOS provides a mechanism for checking to see whether an object is pausable.

You first need to load the MediaExtension library and obtain a MediaExtension object, as described above. Once you have obtained a MediaExtension object, you can check for pausability as shown here:

// this.extObj is a MediaExtension object associated with
// the Media object whose pausability we want to check
if (this.extObj.pausable) {
  // Adjust UI and app logic accordingly...
} 

Handling Media Events

HTML 5 Media objects fire a variety of events to indicate state changes. You can listen for these events and respond as appropriate within your application.

As noted above, under certain circumstances webOS may automatically pause and resume playback of objects whose audio class you have set to media. When this occurs, the affected object will fire a pause or play event, which you should use to trigger UI updates and any other operations your app may need to perform.

The following example illustrates how to listen for and respond to these events:

MySceneAssistant.prototype.setup = function() {
  // Load the MediaExtension library, required to set audio class
  this.libs = MojoLoader.require({ name: "mediaextension", version: "1.0"});

  // Get a reference to the media element, using its ID
  this.mediaObj = this.controller.get("myMediaElement");

  // Get the MediaExtension object and set the audio class
  this.extObj = this.libs.mediaextension.MediaExtension.getInstance(this.mediaObj);
  this.extObj.audioClass = "media";

  // Listen for pause and play events
  this.mediaObj.addEventListener("pause", this.handlePause.bind(this), true);
  this.mediaObj.addEventListener("play", this.handlePlay.bind(this), true);
}

MySceneAssistant.prototype.handlePause = function(evt) {
  Mojo.Log.info("received pause event");
  // Update UI, etc.
}

MySceneAssistant.prototype.handlePlay = function(evt) {
  Mojo.Log.info("received play event");
  // Update UI, etc.
} 

Setting a Video Object's Fit Mode

As described previously, a video in webOS is automatically scaled to the full size of the display. If the aspect ratio of the video is not the same as the aspect ratio of the display, the video's fit mode determines exactly how the video is scaled. The possible values for fit mode are:

  • fill: The video is scaled so that it fills the display in both dimensions. Some cropping may occur in one dimension. This is the default value.

  • fit: The video is scaled so that it fills the display in one dimension. Some padding may be required in the other dimension.

To set an object's fit mode, you first need to load the MediaExtension library and obtain a MediaExtension object, as described above. Once you have obtained a MediaExtension object, you can set the fit mode as follows:

// this.extObj is a MediaExtension object associated with
// the Media object whose fit mode we want to set
this.extObj.setFitMode("fit");

Supported Video Formats

This section lists the supported video formats, encoding profiles, containers, and streaming protocols.

Supported Formats

The following are the supported video formats:

  • H.264 Decoder, Baseline Profile at Level 3:
    • Maximum image resolution: VGA (640 x 480 pixels)
    • Maximum frame rate: 30 fps
    • Maximum bit rate: 1.5 Mbps

  • MPEG4/H.263 Decoder, MPEG4 Visual Simple Profile at Level 5:
    • Maximum image resolution: VGA (640 x 480 pixels)
    • Maximum frame rate: 30 fps
    • Maximum bit rate: 1.5 Mbps

Encoding Profiles

We recommend the following encoding profiles for local playback:

  • Content with aspect ratio 4:3 (for example, full-screen standard-definition TV):

    H.264 480x360, 1.5 Mbps, 30 fps; AAC 44 KHz stereo, 160 Kbps

  • Content with aspect ratio 16:9 (for example, widescreen film and high-definition TV):

    H.264 480x270, 1.5 Mbps, 30 fps; AAC 44 KHz stereo, 160 Kbps

Containers

The supported audio/video containers are MP4, M4A, M4V, MOV, 3GP, and 3G2.

Streaming

The recommended and supported encoding settings for streaming, by protocol and bandwidth, are as follows.

Note: The Pixi device does not include Wi-Fi. Therefore, use only the low bandwidth recommendations.

  • HTTP progressive download

    • High bandwidth
      Recommended: 512 Kbps, H.264 Baseline Profile, 480 x 320 pixels, 30 fps; 64 Kbps, AAC+, 44 KHz, stereo.
      Supported: all local formats.

    • Low bandwidth
      Recommended: 128 Kbps, H.264 Baseline Profile, 320 x 240 pixels, 20 fps; 24 Kbps, eAAC+, 44 KHz, stereo.
      Supported: all local formats.

  • Real Time Streaming Protocol (RTSP)

    • Low bandwidth
      Recommended: 128 Kbps, H.264 Baseline Profile, 320 x 240 pixels, 20 fps; 24 Kbps, eAAC+, 44 KHz, stereo.
      Supported: H.264, MPEG-4, and H.263 video; AAC and AMR audio.