
Migration of a streaming service from RTMP to WebRTC

A Story of One Video Streaming Project


Interesting client

I had been sitting in front of the monitor for a good hour already, or maybe even two. It all started with some Twitter link a colleague of mine kindly dropped to me in Skype. Then I idly opened some news site, then Facebook, and enough time passed for a couple of worthy news items to come around again...

Anyway, my back had gone numb, and I decided to take a walk to stretch it. Frying myself in the summer heat didn't seem like a good idea - it was cool in the office, with the air conditioners silently doing their job. So I limped as far as the nearest coffee machine.

Somewhere at the reception a bell rang. A few minutes later I saw Anna accompanying a tall gentleman. I guessed his age at about fifty. He wore a grey short-brimmed hat on his slightly wrinkled head. And they were heading toward me.

Having reached the coffee machine, which was already pouring cappuccino into my cup, the gentleman said in slightly broken English: "Hello, I am about to WebRTC project. Mine name Hans Schreder," and offered me his hand. I answered the handshake, wondering what could have brought this German here, and invited the guest into my office.

Gathering requirements

H: We have been operating a streaming service based on Flex since 2000, for a large user audience. We use Adobe Flash Media Server (FMS) and would like to switch to WebRTC.

Me: Can you elaborate on the goals you are pursuing by switching to a WebRTC server?

H: We need a conventional media server that accepts video streams from a user and transmits them to other users. We want a video chat.

Me: No problem, we can develop a solution based on one of WebRTC servers.

H: Adobe FMS works fine for us. We would just like to extend our user audience to WebRTC as well, leaving FMS as is - it works well. Hans took out a tablet, slid it over to me and pointed to the following diagram:

[Diagram: the doctor's Flex App and the patients' Flex Apps connected through Adobe FMS]

H: The Flex App here is the doctor; the other Flex Apps are patients. The doctor uses a web camera to consult several patients simultaneously. Any of the patients can ask for a private consultation, and the doctor then initiates a one-on-one appointment with that patient. The other patients cannot communicate with the doctor during that time and cannot see him.


That's weird, I thought. What kind of consultation can a doctor give to multiple patients at once? One has an aching ear, another bemoans his tonsils. And then the one with the tonsils clicks a button for a private consultation. Still, the principle was clear. On to the technical side of the question.

 

Me: You mean the video connection between the doctor and the patient is bidirectional?

H: Not exactly. The patient always sees the doctor, but the doctor cannot see the patient. By default, the patient's video is off.

Me: So, it is one-directional - from the doctor to the patient?

H: In many cases, yes. However, sometimes patients want to show their videos to the doctor. This is a rare case, though.

Me: I see. So, you want both the doctor and the patients to use a WebRTC browser like Firefox or Google Chrome, as well as IE, which works via FMS. Is that correct?

H: Almost. All our doctors use a Flex app developed by us. Patients should also use the app, or WebRTC.

Me: So, ideally, the application should look as follows? And I sketched a scheme:

[Diagram: a native Flex application on one side, a WebRTC browser on the other, both connected through the server]

H: That is right. It should work just like that. On one side there is a native Flex application, and on the other side there is a WebRTC browser. We are interested mostly in Android smartphones and iOS devices. As you know, Flash is supported by all major desktop browsers - IE, Chrome, Firefox, Safari - but Android and iOS lack it. We would like you to make our service available to mobile users as well, and to keep what already works on desktop computers, that is, FMS.

Me: WebRTC works fine in Android browsers, but we have a problem with iOS - due to platform limitations, WebRTC doesn't work there. We won't be able to deliver a WebRTC video stream to iOS, and we cannot stream video from the iOS browser's web camera either.

H: Wait a minute, I know that Safari does not support WebRTC, but Google Chrome does.

Me: Yes, but not under iOS. On that platform, Chrome runs into the technical limitations of the platform and simply has no way to deliver WebRTC video the way it does on desktop computers. So an iOS browser is not an option here. Why wouldn't you upload your own app to the Apple App Store? Then iOS users could simply install the app and use a pure WebRTC solution, just as in Google Chrome.

H: Unfortunately, we cannot submit our application to the App Store for specific reasons. Besides, we would like to give our users (the patients) a way to communicate with the doctor directly from the browser, without installing any apps on their iPhones or iPads. What options do we have then?

At this moment I wondered about those "specific reasons" preventing them from publishing the application in the App Store. Indeed, it could be that medical consulting is a regulated sphere, so it might not be that easy to simply upload an app of this kind to the App Store.

In fact, there were not many options left. The best variant for them would be a native application with WebRTC support. iOS Safari does support HLS (Apple HTTP Live Streaming), but this option was ruled out by the intended real-time conversation between the doctor and the patient: HLS, with its latency of about 20 seconds, simply does not suit live communication.

The only option left was Websockets. Websockets are supported by almost all browsers. Essentially, a Websocket is a TCP channel that can deliver video with low latency - about 3 seconds, which is comparable to RTMP. That is much better than 20 seconds, anyway. So much for delivery. Then we also want to play this stream in the HTML5 'video' element and be done with it.
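In browser terms, the idea is simple: open a Websocket, receive binary chunks of encoded video and hand them to whatever component renders the stream on the page. A minimal sketch of the receiving side (the URL and the 'decoder' object are hypothetical placeholders - the actual wire format depends entirely on the server feeding the socket):

JavaScript
// Sketch: consume a video stream over a Websocket in the browser.
// 'decoder' and the URL are illustrative placeholders, not a real API.
var ws = new WebSocket('ws://my-server:8084/stream');
ws.binaryType = 'arraybuffer';              // receive binary chunks, not text

ws.onmessage = function (event) {
    // each message carries a chunk of encoded video; pass it on to
    // the component that decodes and renders it on the page
    decoder.write(new Uint8Array(event.data));
};

ws.onclose = function () {
    console.log('stream channel closed');
};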

Me: It seems we have only one option then - Websockets. In this case, patients won't be able to send their video to the server; only one-directional delivery from the doctor to the patient is possible. We could also try HLS, but with a latency of about 20 seconds you would hardly find that option suitable.

H: OK. Did I understand correctly that we can play live streams from FMS directly in the iOS Safari browser - WebRTC or not - but still with a low latency similar to what we got with RTMP?

Me: Absolutely. But we have to test this first. Let's meet on, say, Monday, and I will show you the demo.

H: I would like to see how FMS integrates with WebRTC and Websockets to be sure this will work on both iOS and Android. Is this possible?

Me: Yes, I think it is.

H: Thank you for the consultation. I will come on Monday at 10, if you don't mind, and we'll talk over a working demo.

Me: Sure. By then everything will be set up.

Looking for a solution

As seen from the talk, the requirements had changed. Now we had to plug two delivery methods into Adobe AMS: WebRTC for Android and Websockets for iOS Safari. All we needed was the missing link that would allow us to build the demo and tie together all the protocols and technologies involved.


I saw the German guest off and started looking through the Adobe AMS specification. There were many interesting things in it, but not the words "WebRTC" or "Websocket".

 

Then I decided to simply Google the three keywords: rtmp, webrtc, websockets. Google returned a range of relevant sites. Only two of them were worthy: the Flashphoner project and the description of an open-source prototype from Phoboslab.


First candidate

I started with Phoboslab, where I found a technical description of the iOS Safari video stream playback problem and a solution that appeared to be open source. The solution was based on ffmpeg, node.js and client-side JavaScript that decoded and played the video stream. All components were indeed open source, so the scheme looked promising. I set up a virtual server on DO, built ffmpeg and installed node.js. All of that took as little as two hours.
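The server part of such a scheme is essentially a relay: ffmpeg pushes the encoded stream into the node.js process over HTTP, and node.js fans the bytes out to every connected Websocket client. A rough sketch of such a relay (the ports and the use of the 'ws' module are my assumptions, not the prototype's exact code):

JavaScript
// relay.js - sketch of a node.js relay: accept an encoded stream over
// HTTP (e.g., from ffmpeg) and rebroadcast it to all Websocket clients.
var http = require('http');
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8084 });      // browsers connect here

http.createServer(function (req, res) {             // ffmpeg sends the stream here
    req.on('data', function (chunk) {
        wss.clients.forEach(function (client) {
            if (client.readyState === 1) {          // 1 = OPEN
                client.send(chunk);                 // forward the chunk to a viewer
            }
        });
    });
}).listen(8082);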


The video did indeed play in iOS Safari, and played well. My iPhone 5 became a bit warm, but the JavaScript steadily processed the video traffic from the Websocket and displayed it on the web page.

As a matter of fact, JavaScript decoded the stream and drew it on the 'canvas' element of the page in the iOS Safari browser.

However, the following questions remained:

  • How to pick up a stream from FMS?
  • How to add sound to the stream?
  • What about WebRTC?

And here came a bit of frustration. It turned out the JavaScript player played (drew) video only. For audio we would need an additional stream, plus some way to synchronize the two - and the solution wasn't designed for that. Therefore, this candidate did not meet the requirements: it couldn't carry the doctor's consultations for lack of sound.

Second candidate

The other candidate was Web Call Server, which claimed support for the RTMP, WebRTC and Websocket protocols. So I just needed to test whether it was applicable to my specific case and to see how it worked.

At first, I decided to test how an RTMP video stream converts to Websocket, just as I had done before with the first candidate. If that succeeded, we could simply redirect the RTMP stream from FMS to Web Call Server and thereby solve one of the tasks.


Armed with my iPhone, I opened one of the demo pages that offered to test the operation against the demo server. According to the support team, Web Call Server can be quickly installed on a Linux system, but that still takes time, and the demo is a quick way to see whether it works at all. The demo interface was a simple Flash application called Flash Streaming, with a plain interface and a very simple function.


From this Flash app one can connect to the server via the RTMP protocol and publish a webcam stream. "Publish" means capturing a video stream from the browser's web camera using Flash Player and sending the data to the server in real time over RTMP.

Judging by the Connected and Publishing statuses, the connection was successful and the webcam stream was properly sent to the server. To avoid flashing my own face in the stream, I used a virtual camera instead, playing a random Game of Thrones episode.

Now we had to see and hear the video stream on the iPhone in the Safari browser. For this, a separate player called WS Player Minimal was recommended.


The player managed to show a decent picture and sound, with no distortion or desync. It seemed I had made some progress here:

  • I managed to test RTMP-to-Websocket stream delivery
  • The stream carried both sound and video and displayed correctly in Safari

Next I had to test WebRTC playback of the stream, and then move on to the Adobe FMS integration. To test the same stream with WebRTC, I opened another demo, called Streamer and Player Minimal, in Chrome and went through the same simple procedure: pasted the stream name and clicked Play.

[Screenshot: the Game of Thrones stream playing in Chrome]

Boy was I glad, Khaleesi!
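For reference, WebRTC playback on the browser side boils down to the standard RTCPeerConnection API; only the signaling - how the offer and answer travel between the page and the media server - is server-specific. A rough sketch with the signaling stubbed out (sendOfferToServer is a hypothetical helper):

JavaScript
// Sketch of WebRTC playback in the browser. The signaling exchange is
// specific to each media server and is only stubbed out here.
var pc = new RTCPeerConnection();

pc.ontrack = function (event) {
    // attach the incoming remote stream to an HTML5 <video> element
    document.getElementById('remoteVideo').srcObject = event.streams[0];
};

pc.createOffer({ offerToReceiveAudio: true, offerToReceiveVideo: true })
    .then(function (offer) {
        return pc.setLocalDescription(offer);
    })
    .then(function () {
        // deliver the offer via the signaling channel, then apply the answer:
        // pc.setRemoteDescription(new RTCSessionDescription(answer));
        sendOfferToServer(pc.localDescription);     // hypothetical helper
    });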

Now I had RTMP stream delivery to Chrome in my arsenal, and that also meant delivery to Android via WebRTC and to iOS Safari via Websockets. In both cases the picture was smooth, with sound and all, and was quite suitable for consulting services.

The next point was to deal with FMS. Surely the RTMP protocol should be the same across implementations, but I still had to find out: a) could FMS retranslate the RTMP stream to Flashphoner, and b) would Flashphoner accept that stream just as it accepted the stream from Flash in the tests above.

Adobe Media Server integration

It took an effort to deal with FMS. Installing and testing it took a good few hours. The first thing I did was test FMS with FMLE (Adobe Flash Media Live Encoder) to make sure I had installed and configured FMS correctly and that RTMP video streams ran through it without obstacles.


The next step was to configure redirection of the RTMP stream to Flashphoner. Here I had to use my head a lot. Armed with the Adobe ActionScript docs, I finally implemented the following script, main.asc:

JavaScript
var wcsServer = "wcs5-eu.flashphoner.com";  // address of the Flashphoner (WCS) server
var netConnections = new Object();          // client.id -> NetConnection to WCS
var streams = new Object();                 // stream name -> NetStream to WCS

// A client has connected to FMS: open a matching RTMP connection to WCS.
application.onConnect = function (client){
    trace("onConnect "+client.id);
    var nc = new NetConnection();
    var obj = new Object();
    obj.login = "Alice";
    obj.appKey  = "flashChatApp";
    nc.connect("rtmp://"+wcsServer+":1935",obj);
    nc.onStatus = function(info){
        trace("onStatus info.code: "+info.code);
        if (info.code=="NetConnection.Connect.Success"){
            trace("connection opened "+wcsServer);
        }
    };
    netConnections[client.id]=nc;
    return true;                            // accept the client on FMS
}

// The client has disconnected: close its connection to WCS as well.
application.onDisconnect = function (client){
    trace("onDisconnect "+client.id);
    var nc = netConnections[client.id];
    if (nc){
        nc.close();
        delete netConnections[client.id];   // drop the stale reference
        trace("disconnected "+client.id);
    }
}

// The client has started publishing a stream: re-publish it to WCS.
application.onPublish = function(client, myStream){
    trace("onPublish "+myStream.name);
    var nc = netConnections[client.id];
    var ns = new NetStream(nc);             // outgoing stream towards WCS
    ns.onStatus = function(info){
        if (info.code == "NetStream.Publish.Start"){
            trace("It is now publishing "+myStream.name);
        }
    };
    ns.attach(myStream);                    // feed the incoming stream into it
    ns.publish(myStream.name);
    streams[myStream.name]=ns;
    trace("publish stream "+myStream.name+" to: "+wcsServer);
}

// The client has stopped publishing: stop the re-published stream too.
application.onUnpublish = function(client, myStream){
    trace("onUnpublish "+myStream.name);
    var ns = streams[myStream.name];
    if (ns){
        ns.publish(false);                  // stop publishing to WCS
        delete streams[myStream.name];      // drop the stale reference
        trace("unpublished "+myStream.name);
    }
}

The script is simple. All it does is delegate incoming FMS connections and video streams to the Flashphoner server. When a client connection arrives at the application's onConnect method, a corresponding RTMP connection to Flashphoner is established. When a video stream arrives at onPublish, the same stream is re-published on Flashphoner. On disconnects and stopped streams, the corresponding calls are delegated as well, freeing the resources used.

So I had a bridge between FMS and Flashphoner that channeled the traffic and cast it further to WebRTC and Websockets.


To test this composition, I took the same Flash Streaming interface I had worked with before. The only difference now was to specify the RTMP address of the FMS server and rely on the main.asc script to delegate the video stream to Flashphoner. In my case the address was rtmp://my-fms:1935


Due to my lack of ActionScript knowledge and my pitiful FMS programming skills, I went through the nine hells of debugging while writing this script, but the past is past, and the code sample above is the final, working version. FMS did well and passed the RTMP stream on to the addressee as expected, which allowed me to play it successfully in Chrome and Safari.


Installing Web Call Server

So the demo was ready. The only thing left was to install Web Call Server on a system of my own, to prevent any faults during the demonstration - who knows what they might change on the demo server by Monday? The developer's site offered installation instructions of just five steps. I omitted the fifth step - installing SSL certificates - because I didn't plan to use WebRTC streaming from a camera and microphone.

  1. Download Web Call Server 5
  2. Install using the 'install.sh' script
  3. Launch using 'service webcallserver start'
  4. Open the web interface at http://host:9091 and activate your license

Installation was successful. I had preliminarily disabled the firewall on the test server with "service iptables stop" to avoid any possible traffic-blocking issues. Within a minute of installing the server, I managed to open the web interface with the admin panel at http://host:9091, activate the test license and end up with a demo server on my Ubuntu box that looked very similar to this:

[Screenshot: the Web Call Server web admin panel]

Another iteration of tests reassured me that the system worked perfectly in my environment too. With a sense of accomplishment I finished the work and made a note to myself to run the tests one more time on Monday at 9:00, right before Hans arrived. That's all for the first part. I will describe the migration issues and how it all finally turned out in the second part, if anyone is interested.

Used tools:

  1. FMS (Flash Media Server), aka AMS (Adobe Media Server) - the RTMP media server.
  2. DO (Digital Ocean) - virtual server hosting.
  3. WCS (Flashphoner Web Call Server) - the WebRTC and Websocket media server.
  4. FMLE (Adobe Flash Media Live Encoder) - a client to check RTMP connections to the server.
  5. Phoboslab - an open-source prototype of Websocket streaming to iOS Safari.

 

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

