Create applications that stream video and audio to an unlimited number of clients, across platforms and browsers, and to virtually any device, whether it be wireline, wireless, mobile, TV, PC, or another digital device.
There are two major methods of delivering streaming video and audio content over the Web. The first method uses a standard Web server to deliver the video, audio, and data to a media player. The second method uses a separate streaming media server specialized to the audio/video/data streaming task. Deploying streaming media content with the Web server approach is actually only a small evolutionary step away from the download-and-play model.
Although Web server streaming can be an effective interim solution, a streaming server is more efficient and flexible and provides a better user experience. With the second method, streaming media files begin playing almost immediately, while the information is being sent, without having to wait for the whole file to download. And, viewers are immediately able to jump to any part of the video regardless of the length of the video or whether it has all been downloaded yet.
This article will focus primarily on the use of a separate streaming media server, Flash Media Server 3, which is available in three editions:
- Flash Media Interactive Server: The full-featured edition of the server.
- Flash Media Development Server: A development version of Flash Media Interactive Server. Supports all the same features but limits the number of connections.
- Flash Media Streaming Server: Supports the live-video and video-on-demand (recorded) streaming services only. This server edition does not support server-side scripting or stream recording.
Adobe Flash Media Streaming Server 3, focused on one-way streaming, is a scalable, real-time media server that delivers high-quality (up to HD level) on-demand and live video and audio, regardless of the platform. It communicates and streams to Flash Player, Adobe AIR, mobile phones with Flash Lite 3, and the new Adobe Media Player consistently across platforms and browsers.
In addition to delivering live video and video on demand (vod) to clients, Flash Media Streaming Server 3 offers a client API that lets you develop custom solutions that use these two services. This server should not be confused with Flash Media Interactive Server and Flash Media Development Server, which offer both client APIs and server APIs, and which were the subjects of an earlier article, Part 1 of this series.
Flash Media Streaming Server enables you to create client applications that stream live video and audio to an unlimited number of clients. These streams could be time delimited, like a short event, or always on, like a television or radio station. Other clients could use recorded media—anything from short commercials, movie trailers, and music videos to television programs and full-length movies.
And, Flash Media Server 3 supports playback of a variety of stream formats, including Flash Video (FLV), MP3 (MPEG-1 Audio Layer 3), and MPEG-4 (MP4).
Note: MPEG-4 Part 10 is H.264, which will be discussed in some detail below.
Figure 1: Streaming live or recorded media to clients using RTMP protocol
Live and vod Services
The live service is a publishing point on Flash Media Server. You can use a media encoder to capture, encode, and stream live video to the live service and play the video with a client or with the FLVPlayback component shown in Figure 2. You also can build your own application to capture video and your own client application to play the video. For example, a producer could use Flash Media Encoder to capture and encode live audio and video from a speech and publish it to the server. Users could view the speech in a Flash Player, AIR, or Flash Lite client that subscribes to the stream.
The vod (video on demand) service lets you stream recorded media without building an application or configuring the server. You can use the Flash CS3 and Flash 9 FLVPlayback components as clients. You just need to copy MP4, FLV, and MP3 files into the vod application's media folder to stream the media to clients.
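As a concrete illustration, the FLVPlayback component needs only a source URI to play from the vod service. The following client-side ActionScript 3 sketch assumes a server on localhost and a file named sample.mp4 already copied into the vod media folder; both names are placeholders for your own.

```actionscript
// Client-side ActionScript 3 (Flash CS3): play a recorded file from the
// vod service with the FLVPlayback component. Host and file names are
// assumptions for illustration.
import fl.video.FLVPlayback;

var player:FLVPlayback = new FLVPlayback();
player.skin = "SkinUnderPlayStopSeekMuteVol.swf"; // any FLVPlayback skin SWF
// MP4/H.264 files are addressed with the mp4: prefix; FLV files need none.
player.source = "rtmp://localhost/vod/mp4:sample.mp4";
addChild(player);
```

For an FLV file, the source would simply be "rtmp://localhost/vod/sample" with no prefix.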
The live and vod services are signed (approved) by Adobe. Flash Media Streaming Server only supports signed services—it cannot run other applications. Flash Media Interactive Server and Flash Media Development Server, in contrast, support these signed services as well as any other applications you create.
Flash Media Streaming Server 3 provides single-server support with no restriction on the amount of bandwidth streamed or the number of connected users. To provide even more capacity, you can upgrade to the more expensive Flash Media Interactive Server, which features an Origin/Edge architecture with virtually unlimited scaling potential. Part 3 of this series of articles will include a discussion of this architecture.
Figure 2: An FLVPlayback component on a Flash Professional CS3 stage
Video players stream recorded or live video from Flash Media Server to users. To deliver recorded video on demand, Adobe provides a video player called the FLVPlayback component; you also can use ActionScript to develop your own client video player. Adobe Flash Media Encoder lets you capture audio and video while streaming it to Flash Media Server. And, you also can use ActionScript to build a custom Flash Player application that captures audio and video.
The References section includes several web videos and print tutorials on how to build custom applications with FLVPlayback and other components.
The following are examples of the type of content you might stream:
- Short video clips, such as commercials up to 30 seconds.
- Longer clips (often called "long tail" content) up to 30 minutes.
- Very long clips, such as recorded television shows or movies up to several hours long.
- Video playlists, which play a list of streams in sequence, whether live streams, recorded streams, or a mix. The playlist can be in a client-side script or, on Flash Media Interactive Server, in a server-side script.
Video with Advertising
A streaming video application can insert advertising at various points, such as a short commercial that plays before a recorded television show or live video. The advertisement is often streamed from one server and the content is streamed from another server or from a Content Delivery Network. A video-with-advertising application typically connects to the ad server, streams the ad, and then closes the connection to the ad server. It then connects to the content server, streams the content, and closes that connection, repeating this sequence each time video is streamed. Figures 3 and 4 illustrate this kind of application.
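The connect, play, close, reconnect sequence described above can be sketched in client-side ActionScript 3. This is a minimal, hedged outline: the server names, application names, and stream names are all hypothetical, and the code to attach the stream to a Video object for display is omitted for brevity.

```actionscript
// Client-side ActionScript 3: ad-then-content sequence (servers hypothetical).
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
var ns:NetStream;
var playingAd:Boolean = true;

nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://ads.example.com/ondemand");  // 1. connect to the ad server

function onStatus(e:NetStatusEvent):void {
    switch (e.info.code) {
        case "NetConnection.Connect.Success":
            ns = new NetStream(nc);
            ns.client = { onMetaData: function (o:Object):void {} };
            ns.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
            // (attach ns to a Video object here to display it)
            ns.play(playingAd ? "commercial" : "mp4:feature.mp4");
            break;
        case "NetStream.Play.Stop":             // the current stream finished
            if (playingAd) {
                playingAd = false;
                nc.close();                     // 2. close the ad connection
                nc.connect("rtmp://content.example.com/ondemand"); // 3. content
            }
            break;
    }
}
```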
Figure 3: A short advertisement plays before a recorded or live video
Figure 4: An advertisement is streamed from one server and the content is streamed from another server
A new feature available in all editions of Flash Media Server 3, multipoint publishing adds flexibility and scalability to your streaming applications. Previously, if you used a content delivery network (CDN) to deliver your streaming content, because you required massive capacity and high reliability for delivering digital media such as video, music, games, software, and social media, you were unable to implement any custom server-side code or inject any data messages into the outbound stream. Now, with multipoint publishing, you can use your own Flash Media Server (or Flash Media Encoder) to control the feed to the CDN, which then broadcasts it to your clients (as shown in Figure 5).
Note: The free development edition can actually be used in commercial applications as this local live publishing point.
Multipoint publishing can be used to build large-scale live video applications or to inject metadata into a live stream. For example, you could create an Internet TV station and publish the stream to a Flash Media Development Server, which would publish the stream to a larger Flash Media Interactive Server deployment, such as a CDN that pushes the stream to millions of users.
Figure 5: Multipoint publishing
Multipoint publishing allows clients to publish to servers with only one client-to-server connection. This feature enables you to build large-scale live broadcasting applications, even with servers or subscribers in different geographic locations.
Note: To use multipoint publishing, you need to write server-side code in a main.asc file.
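A hedged sketch of what that main.asc code might look like follows, using the server-side NetConnection and NetStream classes to forward a locally published stream upstream. The upstream URL is hypothetical; adapt it to your CDN's ingest point.

```actionscript
// Server-Side ActionScript (main.asc): multipoint publishing sketch.
// When a client publishes to this server, forward the stream to an
// upstream server (URL hypothetical).
application.onPublish = function (client, myStream) {
    var nc = new NetConnection();
    nc.connect("rtmp://upstream.example.com/live");
    var ns = new NetStream(nc);
    ns.setBufferTime(2);
    ns.attach(myStream);                 // bind the local stream to the
    ns.publish(myStream.name, "live");   // outbound server-to-server stream
    this.outgoing = ns; // keep a reference so it isn't garbage collected
};
```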
One of the challenges in live video broadcast is the need for current stream metadata to be sent to viewers who are connecting midstream. Unlike an on-demand stream, where metadata can always be at the beginning of the stream and received when a user first subscribes, live streams can be subscribed to at any time. Therefore, these latecomers may never receive the live stream's metadata. Data keyframes eliminate this issue by sending metadata to new subscribers when they join the stream.
So, if you create live streams that you do not record, or you want to add additional information, you need to set the metadata yourself. The metadata you add is in the form of data keyframes. Each data keyframe can contain multiple data properties, such as a title, height, width, duration of the video, the date it was created, the creator's name, and so on. Now, any client connecting to the server receives the metadata when the live video is played.
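Setting a data keyframe is done with the server's special @setDataFrame handler. The following client-side ActionScript 3 sketch assumes nc is a NetConnection already connected to the live service; the stream name and metadata values are illustrative.

```actionscript
// Client-side ActionScript 3: add a data keyframe to a live stream.
// nc is assumed to be a connected NetConnection (values illustrative).
var ns:NetStream = new NetStream(nc);
ns.publish("livestream", "live");

var metaData:Object = new Object();
metaData.title = "Company Meeting";
metaData.width = 640;
metaData.height = 480;
metaData.creationdate = "2008-03-01";
// @setDataFrame stores the data keyframe on the server, so every
// subscriber, including latecomers, receives it when playback starts.
ns.send("@setDataFrame", "onMetaData", metaData);
```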
Note: The Flash Video Exporter utility (version 1.1 or later) is a tool that embeds a video's duration, frame rate, and other information into the video file itself. Other video encoders embed different sets of metadata, or you can explicitly add your own metadata.
Adobe Flash Media Server applications have a client-server architecture. The client code is written in ActionScript and runs in Adobe Flash Player, Adobe AIR, or Adobe Flash Lite. The server code is written in Server-Side ActionScript, which is similar to ActionScript 1.0.
The server and the client communicate over a persistent connection using Real-Time Messaging Protocol (RTMP). RTMP is a reliable TCP/IP protocol for streaming and data services. In a typical scenario, a web server delivers the client over HTTP. The client creates a socket connection to Flash Media Server over RTMP. The connection allows data to stream between client and server in real time. This architecture is illustrated in Figure 6.
Figure 6: Flash Media Server 3 application
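The connection sequence in Figure 6 reduces to a few lines of client-side ActionScript 3. This is a minimal sketch; the application name (vod) and the stream name are assumptions.

```actionscript
// Client-side ActionScript 3: open a persistent RTMP socket to
// Flash Media Server and play a stream over it (names assumed).
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
import flash.events.NetStatusEvent;

var nc:NetConnection = new NetConnection();
nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
nc.connect("rtmp://localhost/vod");   // persistent RTMP connection

function onStatus(e:NetStatusEvent):void {
    if (e.info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc); // stream rides the RTMP socket
        ns.client = { onMetaData: function (o:Object):void {} };
        var video:Video = new Video(320, 240);
        video.attachNetStream(ns);
        addChild(video);
        ns.play("mp4:sample.mp4");
    }
}
```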
Real-Time Messaging Protocol
All server editions communicate with Flash Player, AIR, and Flash Lite over Real-Time Messaging Protocol. RTMP is optimized to deliver high-impact streams in real time. An RTMP connection can multiplex any number of streams. Each stream contains synchronized audio, video, and data channels. Remote method invocation and shared object messages are carried in a data-only stream.
There are five types of RTMP connections supported by Flash Media Server 3:
- RTMP: This is the standard, unencrypted Real-Time Messaging Protocol. The default port is 1935; if a port is not specified, the client attempts to connect to ports in the following order: 1935, 443, 80 (RTMP), 80 (RTMPT).
- RTMPT: This protocol is RTMP tunneled over HTTP; the RTMP data is encapsulated as valid HTTP. The default port is 80.
- RTMPS: This protocol is RTMP over SSL. SSL is a protocol for enabling secure communications over TCP/IP. (Flash Media Server provides native support for both incoming and outgoing SSL connections.) The default port is 443.
- RTMPE: This protocol is an encrypted version of RTMP. RTMPE is faster than SSL, does not require certificate management, and is enabled in the server's Adaptor.xml file. If you specify RTMPE without explicitly specifying a port, the Flash Player scans ports just like it does with RTMP, in the following order: 1935 (RTMPE), 443 (RTMPE), 80 (RTMPE), and 80 (RTMPTE).
- RTMPTE: This protocol is RTMPE with an encrypted tunneling connection. The default port is 80.
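In client code, you select among these five connection types simply by the scheme (and, optionally, the port) in the connection URI. A short sketch, with hypothetical host names:

```actionscript
// Client-side ActionScript 3: choosing the RTMP flavor in the URI.
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://fms.example.com/vod");         // port scan: 1935, 443, 80
// nc.connect("rtmp://fms.example.com:1935/vod"); // explicit port, no scan
// nc.connect("rtmpt://fms.example.com/vod");     // tunneled over HTTP (port 80)
// nc.connect("rtmps://fms.example.com/vod");     // RTMP over SSL (port 443)
// nc.connect("rtmpe://fms.example.com/vod");     // encrypted RTMP
```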
Note: RTMP should not be confused with RTSP. RTSP (Real Time Streaming Protocol) is another protocol for use in streaming media systems that allows a client to remotely control a streaming media server, issuing VCR-like commands such as "play" and "pause," and allowing time-based access to files on a server. The References section includes two links to additional information on RTSP.
H.264 Video and HE-AAC Audio Support
Adobe has moved beyond native Flash file formats to advanced standards like H.264 and HE-AAC. With Flash Media Server and Flash Player supporting H.264 video, there's no reason to use anything else. This is a move towards a more open, standards-based digital world, and I expect most online video will be using these formats by the end of 2008.
Flash Player 9 Update 3 and AIR support video and audio encoded in H.264 and HE-AAC from within MPEG-4 standard file formats, which stream high quality video at lower bit rates. Today, developers can leverage industry standard tools, including Adobe Premiere Pro and Adobe After Effects, to create and deliver compelling video content. All editions of Flash Media Server 3 can stream H.264 and HE-AAC content to Flash Player 9 Update 3 and AIR.
Note: The H.264 and HE-AAC standards provide support for up to 1080p HD. 1080p resolution—which equates to 1,920 x 1,080 pixels—is the latest HD Holy Grail. That's because 1080p monitors are theoretically capable of displaying every pixel of the highest-resolution HD broadcasts. On paper, they should offer more than twice the resolution of today's 1,280 x 720, or 720p, HDTVs. There is a growing selection of consumer televisions with support for both 1080p inputs and outputs.
To pair with the visual power of H.264-encoded video, Flash Player 9 Update 3 also supports HE-AAC audio, the higher-quality successor to MP3. High-Efficiency Advanced Audio Coding (HE-AAC) is a high-fidelity, low-bandwidth audio codec that can be used with or without video.
The new Adobe Media Player, an AIR application, will also support H.264, HE-AAC, and encrypted video content using the new Flash Media Rights Management Server, which will be discussed in the next and final article of this three-part series.
H.264 is the same format used by Apple for its QuickTime videos and for Blu-ray Discs. With H.264 now supported by Adobe as well, I wouldn't be surprised if one of the biggest changes in 2008/2009 is the death of proprietary video codecs.
Flash Media Server doesn't encode or decode audio and video information; it streams media that has already been encoded. To encode and stream media that has already been captured (in other words, media that is not live), use any codec supported by the version of Flash Player or AIR that you want to target.
To capture, encode, and stream live video, you can use either Flash Media Encoder or use ActionScript to build your own Flash Player or AIR client.
Factors Affecting Performance
This section explains how different conditions affect the performance of Flash Media Server 3. You can measure performance by comparing the number of concurrent streams delivered at a given CPU utilization. Knowing how many streams a server can support helps you determine how many servers you need to deploy.
The number of streams a server can deliver is dependent upon a number of conditions, including:
- Protocol: RTMP is the highest performing protocol, followed by RTMPE.
- Video bitrate (quality): The lower the bitrate of your video, the more concurrent streams can be delivered by your server.
- Platform: You can deliver more connections, as shown in Figure 7, with less CPU usage utilizing Red Hat Linux.
- Hardware: Hardware such as RAM, disk speed, CPU, and network speed will all influence the streaming capacity of Flash Media Server 3.
- Configuration: Flash Media Server 3 comes pre-configured for optimal streaming performance for most situations. Changing the configuration may improve your performance.
- Application complexity: If you deploy custom plug-ins or develop complex server-side application logic, your performance may increase or decrease. (These factors apply to Flash Media Interactive Server and Flash Media Development Server.)
- Usage: The way your users interact with your video will impact the server performance. Interactions could include connecting, disconnecting, seeking, or pausing.
The graphs in Figure 7 show the number of concurrent streams delivered at different CPU utilizations, protocols, and bitrates. Notice that at higher CPU utilization, the server can deliver more streams. The systems were limited to 1Gbps throughput (in other words, a 1Gbps network adapter) and never reached 100 percent CPU. Higher data rates saturated the network sooner; lower data rates used more CPU to deliver a similar number of connections.
Figure 7: Measuring performance increases by comparing the number of concurrent streams for a given CPU utilization
RTMPE reduced capacity by only 25 to 30 percent on average at similar CPU usage. If you deploy RTMPE, you can expect increased CPU usage, but you will still be able to saturate a 1Gbps network at less than 70 percent CPU.
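For rough capacity planning, the network ceiling is simple arithmetic: divide the adapter's usable throughput by the stream bitrate. The sketch below is a back-of-envelope estimate only; it ignores per-connection overhead and assumes the CPU is not the bottleneck.

```actionscript
// A back-of-envelope estimate of the network-limited stream count.
// Ignores protocol overhead and CPU limits (illustrative only).
function maxStreams(nicMbps:Number, bitrateKbps:Number):Number {
    return Math.floor((nicMbps * 1000) / bitrateKbps);
}

// A 1Gbps adapter tops out at roughly 2,000 concurrent 500Kbps streams.
trace(maxStreams(1000, 500)); // 2000
```

Measured numbers, as in Figure 7, will come in below this ceiling, especially with RTMPE or many low-bitrate connections.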
Note: An understanding of these factors is essential when considering service level agreements (SLAs), one of the subjects of the next article, Part 3, in this series.
Note: I intentionally omitted stating the hardware and software platforms that were used for the measurements graphed in Figure 7. The point of Figure 7 is to illustrate concepts.
The Cisco Content Delivery System
Figure 8: Cisco Content Delivery System. Any Stream to Any Screen with Adobe Flash Streaming Enhancements to the Cisco CDS
Adobe has partnered with Cisco to embed Flash Media Server 3 in the Cisco Content Delivery System (CDS). The CDS enables providers to stream Flash video to any and all devices, whether they be wireline, wireless, mobile, TV, PC, or other digital devices.
Don't forget that anything encouraging more video (especially high-quality video) is good for Cisco, because high-quality video needs a huge amount of bandwidth.
The CDS platform provides additional benefits for the hosting and distribution of Flash Player-compatible video content and services, including content and service routing, dynamic hierarchical caching, load balancing, failover protection, IP multicast extensions, unified management tools, and end-to-end quality-of-service (QoS) support.
Attempts to display media on computers date back to the earliest days of computing, in the mid-20th century. However, little progress was made for several decades, due primarily to the high cost and limited capabilities of computer hardware.
During the late 1980s, consumer-grade computers became powerful enough to display various media. The primary technical issues with streaming were:
- Having enough CPU power and bus bandwidth to support the required data rates
- Creating low-latency interrupt paths in the OS to prevent buffer underrun
However, computer networks were still limited, and media was usually delivered over non-streaming channels, such as CD-ROMs.
The late 1990s saw:
- Greater network bandwidth, especially in the last mile
- Increased access to networks, especially the Internet
- Use of standard protocols and formats, such as TCP/IP, HTTP, and HTML
- Commercialization of the Internet
These advances in computer networking, combined with powerful home computers and modern operating systems, made streaming media practical and affordable for ordinary consumers. Stand-alone Internet radio devices are offering listeners a "no-computer" option for listening to audio streams.
In general, multimedia content is large, so media storage and transmission costs are still significant; to offset this somewhat, media is generally compressed for both storage and streaming.
Links to representative streaming video applications such as
- Videos in clinical medicine
- Video and audio courses
- Video Streaming & Email Marketing
- Third-party Flash video streaming hosting
are included in the References section.
And, video can help you convey the strength and achievements of your organization and leadership team to customers, partners, vendors, investors, employees, and the media. In an era of transparency and short attention spans, online video provides a personal, emotional experience that connects the presenter to your audience for the length of your presentation. On a video platform, you also can integrate other multimedia such as Flash animation, images, and graphics to increase the comprehension and retention of your message.
Finally, research in streaming media is ongoing. Representative research can be found at the Journal of Multimedia, a publication appropriate for readers with a background in advanced mathematics.
Transmitting streaming video over a wireless communications channel is, in and of itself, a challenging proposition. Wireless communications, in general, are prone to inducing errors into the digital bit stream because of interference, weak signals, multipath fading, crowded air waves, and any number of other factors. Overcoming these conditions is challenging enough, but today's video compression techniques make matters worse. To fit a streaming video application into the wireless bandwidth that's available today, compression standards like MPEG-4 and H.264 have become a way of life. These and other compression techniques help to deliver the video bit stream, but they often work at cross-purposes to the quality of the video image displayed on the terminal device.
The problem becomes acute because the various compression techniques remove much of the redundancy from a typical video bit stream. The logic behind most video compression methods is that the currently displayed frame is the basis for the frames following it.
Wireless communication works fine until an error is encountered in the bit stream. Because of the predictive nature of compression techniques, any error in an image can be propagated through successive frames. This uncontrolled propagation of errors also can cause the video decoder to lose synchronization with the video bit stream, leading to a complete failure of the decoding process. For the viewer, that means a frozen image on the display screen.
Newer compression standards, such as MPEG-4, have taken this phenomenon into account and a number of techniques are now built into the compression standards to overcome part of the problem. These techniques, which are known as error resilience tools, allow for the detection, containment, and concealment of errors in a video bit stream.
Of course, everything comes with a price and the price of error concealment is processor cycles.
Further exacerbating this problem is the fact that errors in wireless communications typically occur in bursts. As a result, the processor cycles devoted to error concealment would rise and fall sharply, further straining the resources of the terminal device at those times when the bit stream is prone to extensive errors.
New multimedia applications will continue to strain the resources of wireless devices. Any headroom designed into a device platform today will be quickly absorbed by new multimedia applications tomorrow.
The obvious solution: Design your applications conservatively.
Benefits of streaming versus HTTP delivery
There are two widely employed methods for delivering video over the Internet using Adobe Flash Player:
- Progressive download (a simple download of a file from a web server)
- Streaming (delivery from a streaming server such as Flash Media Server)
In both progressive and streaming delivery, the video content is external to the SWF file. To deploy video content to the web, the SWF file and the video file are uploaded to a server.
Keeping the video external and separate offers a number of benefits over the embedded video method, including:
- Easy to update: Accommodates dynamic content; it's relatively easy to add or change content independent of the video player and without republishing the SWF file.
- Small SWF file size: Your SWF file can remain very small for faster page loading, allowing the video to be delivered when the user requests it.
- Better performance: Because the FLV and SWF files are separate, the user will have a better playback experience.
Note: Although this section focuses on the delivery of video files, the same methods can be used to deliver audio files. In other words, audio files also can be progressively downloaded or streamed.
Why streaming is better
Progressive download is a simple method of video delivery with very little control—it's basically a simple HTTP download call. Streaming is a method that allows the publisher to control every aspect of the video experience.
The advantages of streaming video from Flash Media Server are numerous:
- Fast start: Streaming video is the fastest way to start playing any video on the web.
- Advanced video control: Features such as bandwidth detection, quality-of-service monitoring, automatic thumbnail creation, server-side playlists, and more.
- Efficient use of network resources: Customers who pay for their video hosting or bandwidth by the number of bits that are transferred can reduce their costs by using streaming video, because only the bits that the client actually views are transferred.
- More secure, protected media delivery: Because the media data is not saved to the client's cache when streamed, viewers can't retrieve the video or audio file from their temporary Internet files folder. There are also additional security features in Flash Media Server 3 that prevent stream ripping and other risks to your file's security.
- Minimal use of client resources: Resources such as memory and disk space are significantly reduced with streaming, because the clients do not need to download and store the entire file.
- Tracking, reporting, and logging capabilities: Because progressive download is a simple download of a file, you can't easily log specific relevant statistics such as how long the video was viewed, if the user navigated forward, backward, or paused the video, how many times the viewer played the video, if the viewer left the web page before the video completed playing, and so on. Streaming enables you to easily capture this important data.
- Full seek and navigation: Users can immediately seek to any point in the video and have it start playing immediately from that point. This makes streaming a great solution for longer playing videos or applications such as video blogging, classroom lectures, and conference sessions, where you may want to jump into the video at a specific point rather than requiring the viewer to watch it from the beginning.
- Deep interactivity: The precise control found in streaming enables developers to create extensive interaction in their video applications. For example, the ability to switch camera angles, have one video spawn another video, or the ability to seamlessly switch to alternate endings, are all enabled by streaming.
- Live video: Streaming provides the ability to deliver live video and audio from any connected webcam or DV camera (camcorder), and even directly from some video cards, natively in Flash Player.
- Video capture and record: (Flash Media Interactive Server only) In addition to live streaming, Flash Media Server also gives you the ability to record video either in conjunction with the live stream (for example, archiving an event) or on its own (for example, video messaging).
- Multiuser capabilities: (Flash Media Interactive Server only) In addition to live one-to-many streaming, Flash Media Server also enables multiuser streaming of audio, video, and data for the creation of video communication applications, as discussed in an earlier article, Part 1 of this series.
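The live and seek capabilities above take only a few lines on the client. The sketch below assumes nc is a NetConnection already connected to the live service; the stream name is a placeholder.

```actionscript
// Client-side ActionScript 3: publish live audio and video from a webcam.
// nc is assumed connected to rtmp://yourserver/live (name a placeholder).
import flash.media.Camera;
import flash.media.Microphone;
import flash.net.NetStream;

var ns:NetStream = new NetStream(nc);
ns.attachCamera(Camera.getCamera());
ns.attachAudio(Microphone.getMicrophone());
ns.publish("mylivestream", "live"); // use "record" on Flash Media
                                    // Interactive Server to archive the event
```

On the subscriber side, ns.play("mylivestream") joins the live stream, and for recorded streams ns.seek(120) jumps straight to the two-minute mark without waiting for a download.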
Although streaming may be perceived as being more difficult than progressive download, they're actually extremely similar—they both use the same components and the same ActionScript commands. Streaming just gives the developer more power to create rich, interactive video applications.
The only potential downside to streaming is that it requires special server software. Just as a robust data application would require you to install an application server in addition to your web server, robust media delivery applications require a streaming server in addition to the web server.
When to choose streaming
You can use streaming with the Flash Media Server in situations where you need to:
- Deliver long files (longer than 30 seconds) or high-bitrate files (greater than 100Kbps)
- Perform bandwidth detection, allowing you to deliver the best-quality video for the available bandwidth
- Monitor quality of service
- Track viewing in real time
- Provide real-time data sharing and interactivity to your video experiences
- Stream live video and/or audio
- Record video and/or audio
- Serve more streams with less bandwidth
If your web site or blog relies heavily on video, audio, or real-time data sharing, you can give your users the best experience by using the features of Flash Media Server.
Flash Lite on hand-held devices
Microsoft has signed a license to use Flash Lite and Reader LE in future Windows Mobile handsets as plug-ins for Internet Explorer Mobile. So, future versions of Windows Mobile handsets will be able to use Adobe's Flash Lite player technology to boost their ability to handle sophisticated Web pages while Microsoft works on a mobile version of its Silverlight competitor to Flash.
Note: Silverlight is Microsoft's attempt to rein in Adobe's position in the web development market with Flash. Microsoft is fighting an uphill battle, though, in trying to get web developers to build sites using its technology as opposed to Adobe's.
At the same time, Apple is working with Adobe with a view to offering Flash support on the iPhone. Adobe needs the iPhone to extend itself fully into the mobile market.
Flash Lite is a stripped-down version of the ubiquitous Flash Player that allows mobile handsets to view web sites created with Flash technology. Think of Flash Lite as a slightly older version of Flash; the most current version of Flash Lite can't properly display web sites created with the newest version of Flash, Flash 9, but it works with sites created using older versions of the technology.
Note: As smartphones become more and more common, people are starting to get fed up with the basic web surfing experience offered by many phones. They want something that looks more like a PC experience, with rich graphics and video. But, that's hard to duplicate on a device with a smaller screen, less memory, a slower processor, and battery life requirements.
References
- Flash Lite architecture: http://www.adobe.com/products/flashlite/architecture/
- H.264 FLVPlayback: https://admin.adobe.acrobat.com/_a295153/p66556927/
- H.264 Intro: http://www.youtube.com/watch?v=Pp7MGzwFFJI
- Harte, L. Introduction to MPEG, Althos (2006)
- Richardson, I. E. G. H.264 and MPEG-4 Video Compression, Wiley (2003)
- Creating a custom UI using a FLVPlayback component with Flash CS3 Professional
- Using the Flash Video Encoder: http://www.adobe.com/designcenter/video_workshop/
About the Author
Marcia Gulesian is an IT strategist, hands-on practitioner, and advocate for business-driven architectures. She has served as software developer, project manager, CTO, and CIO. Marcia is the author of well over 100 feature articles on IT, its economics, and its management.