
Streaming Video with WebRTC in Java — Using the Native API

Introduction

In the previous article we finished the preparatory work, so in this article I will share how I use the WebRTC Native APIs in Java. My other notes on using WebRTC in Java are collected in the series index "在 Java 中使用 WebRTC"; readers interested in this topic can browse it there. The source code for this article can be obtained through the official account linked at the bottom of the article, or as a paid download.

Using the Native APIs

Creating the PeerConnectionFactory

As mentioned when the Native APIs were introduced, WebRTC uses three main threads to handle its work, so the first step is to create those threads through the API. As an aside, the thread library that ships with WebRTC is genuinely powerful; you could even use it as a stand-alone cross-platform thread library, and if I get the chance I will write a separate article about its implementation. Back to the topic: the one important detail when creating the threads is that the network thread must be created with CreateWithSocketServer.

void RTC::InitThreads() {
       signaling_thread = rtc::Thread::Create();
       signaling_thread->SetName("signaling", nullptr);
       RTC_CHECK(signaling_thread->Start()) << "Failed to start thread";
       WEBRTC_LOG("Original socket server used.", INFO);
       worker_thread = rtc::Thread::Create();
       worker_thread->SetName("worker", nullptr);
       RTC_CHECK(worker_thread->Start()) << "Failed to start thread";
       network_thread = rtc::Thread::CreateWithSocketServer();
       network_thread->SetName("network", nullptr);
       RTC_CHECK(network_thread->Start()) << "Failed to start thread";
   }           

In addition, if, like me, you have special audio capture requirements, you will need to implement your own AudioDeviceModule. One thing to watch out for: the AudioDeviceModule must be created on the worker thread, and it must also be released on the worker thread.

void RTC::Init(jobject audio_capturer, jobject video_capturer) { // Initialize the PeerConnectionFactory
       this->video_capturer = video_capturer;
       InitThreads(); // Initialize the threads
       audio_device_module = worker_thread->Invoke<rtc::scoped_refptr<webrtc::AudioDeviceModule>>(
               RTC_FROM_HERE,
               rtc::Bind(
                       &RTC::InitJavaAudioDeviceModule,
                       this,
                        audio_capturer)); // Initialize the AudioDeviceModule on the worker thread
       WEBRTC_LOG("After fake audio device module.", INFO);
       InitFactory();
   }

    // AudioDeviceModule that captures audio data from Java; its implementation is described in detail later
   rtc::scoped_refptr<webrtc::AudioDeviceModule> RTC::InitJavaAudioDeviceModule(jobject audio_capturer) {
       RTC_DCHECK(worker_thread.get() == rtc::Thread::Current());
       WEBRTC_LOG("Create fake audio device module.", INFO);
       auto result = new rtc::RefCountedObject<FakeAudioDeviceModule>(
               FakeAudioDeviceModule::CreateJavaCapturerWrapper(audio_capturer),
               FakeAudioDeviceModule::CreateDiscardRenderer(44100));
       WEBRTC_LOG("Create fake audio device module finished.", INFO);
       is_connect_to_audio_card = true;
       return result;
   }

   ...
    // Releasing the AudioDeviceModule
   worker_thread->Invoke<void>(RTC_FROM_HERE, rtc::Bind(&RTC::ReleaseAudioDeviceModule, this));
   ...

    // Because audio_device_module is stored as an rtc::RefCountedObject, it is effectively a reference-counted pointer: when its reference count drops to 0 the instance's destructor runs automatically, so here we only need to assign nullptr to it
   void RTC::ReleaseAudioDeviceModule() {
       RTC_DCHECK(worker_thread.get() == rtc::Thread::Current());
       audio_device_module = nullptr;
   }           

With the three key threads and the AudioDeviceModule in place, we can create the PeerConnectionFactory. Because my use case restricts which ports may be used, I also initialize the SocketFactory here; it is used later when creating the PortAllocator. At this point you may wonder why video capture and audio capture are injected in different places. You are not alone, I find it puzzling too =.=; I even think the SocketFactory should be managed inside the PeerConnectionFactory, so that you would not have to create a PortAllocator yourself every time you create a PeerConnection.

void RTC::InitFactory() {
        // Create a SocketFactory with port and IP restrictions
       socket_factory.reset(
               new rtc::SocketFactoryWrapper(network_thread.get(), this->white_private_ip_prefix, this->min_port,
                                             this->max_port));
       network_manager.reset(new rtc::BasicNetworkManager());
        // This uses the video encoder factory I implemented myself; it is described in detail later
       peer_connection_factory = webrtc::CreatePeerConnectionFactory(
               network_thread.get(), worker_thread.get(), signaling_thread.get(), audio_device_module,
               webrtc::CreateBuiltinAudioEncoderFactory(), webrtc::CreateBuiltinAudioDecoderFactory(),
               CreateVideoEncoderFactory(hardware_accelerate), CreateVideoDecoderFactory(),
               nullptr, nullptr);
   }           

Admittedly, while creating the PeerConnectionFactory I ran into several interface designs that did not match my expectations, probably because my use case is not the typical one, which makes the WebRTC interfaces feel a bit awkward. In any case, the PeerConnectionFactory is now in place. To recap, the whole process is: create the threads -> create the audio capture module -> create the encoder factories -> instantiate the PeerConnectionFactory.

Creating a PeerConnection

With the PeerConnectionFactory we can now create connections. In this step we need to supply the ICE server information, and I use the SocketFactory created in the previous step to build a PortAllocator, which is how the port range restriction is enforced. I also call a PeerConnection API here to cap the maximum transmission bitrate.

// Create a PeerConnection
   PeerConnection *
   RTC::CreatePeerConnection(PeerConnectionObserver *peerConnectionObserver, std::string uri,
                             std::string username, std::string password, int max_bit_rate) {
        // Pass in the ICE server information
       webrtc::PeerConnectionInterface::RTCConfiguration configuration;
       webrtc::PeerConnectionInterface::IceServer ice_server;
       ice_server.uri = std::move(uri);
       ice_server.username = std::move(username);
       ice_server.password = std::move(password);
       configuration.servers.push_back(ice_server);
        // Disable TCP candidates
       configuration.tcp_candidate_policy = webrtc::PeerConnectionInterface::TcpCandidatePolicy::kTcpCandidatePolicyDisabled;
        // Reduce audio latency
       configuration.audio_jitter_buffer_fast_accelerate = true;
        // Use the previously created SocketFactory to build a PortAllocator, which restricts the usable ports
       std::unique_ptr<cricket::PortAllocator> port_allocator(
               new cricket::BasicPortAllocator(network_manager.get(), socket_factory.get()));
       port_allocator->SetPortRange(this->min_port, this->max_port);
        // Create the PeerConnection and cap its bitrate
       return new PeerConnection(peer_connection_factory->CreatePeerConnection(
               configuration, std::move(port_allocator), nullptr, peerConnectionObserver), peerConnectionObserver,
                                 is_connect_to_audio_card, max_bit_rate);
   }

    // Cap the bitrate through the API
   void PeerConnection::ChangeBitrate(int bitrate) {
       auto bit_rate_setting = webrtc::BitrateSettings();
       bit_rate_setting.min_bitrate_bps = 30000;
       bit_rate_setting.max_bitrate_bps = bitrate;
       bit_rate_setting.start_bitrate_bps = bitrate;
       this->peer_connection->SetBitrate(bit_rate_setting);
   }           

Creating the Audio/VideoSource

In this step we use the PeerConnectionFactory APIs to create the Audio/VideoSource. When creating the AudioSource we can specify some audio options, and when creating the VideoSource we must supply a VideoCapturer. Note that the VideoCapturer needs to be created on the signaling thread.

...
    // Create the Audio/VideoSource
   audio_source = rtc->CreateAudioSource(GetAudioOptions());
   video_source = rtc->CreateVideoSource(rtc->CreateFakeVideoCapturerInSignalingThread());
   ...

    // Get the default audio options
   cricket::AudioOptions PeerConnection::GetAudioOptions() {
       cricket::AudioOptions options;
       options.audio_jitter_buffer_fast_accelerate = absl::optional<bool>(true);
       options.audio_jitter_buffer_max_packets = absl::optional<int>(10);
       options.echo_cancellation = absl::optional<bool>(false);
       options.auto_gain_control = absl::optional<bool>(false);
       options.noise_suppression = absl::optional<bool>(false);
       options.highpass_filter = absl::optional<bool>(false);
       options.stereo_swapping = absl::optional<bool>(false);
       options.typing_detection = absl::optional<bool>(false);
       options.experimental_agc = absl::optional<bool>(false);
       options.extended_filter_aec = absl::optional<bool>(false);
       options.delay_agnostic_aec = absl::optional<bool>(false);
       options.experimental_ns = absl::optional<bool>(false);
       options.residual_echo_detector = absl::optional<bool>(false);
       options.audio_network_adaptor = absl::optional<bool>(true);
       return options;
   }

    // Create the AudioSource
   rtc::scoped_refptr<webrtc::AudioSourceInterface> RTC::CreateAudioSource(const cricket::AudioOptions &options) {
       return peer_connection_factory->CreateAudioSource(options);
   }

    // Create the VideoCapturer on the signaling thread
   FakeVideoCapturer *RTC::CreateFakeVideoCapturerInSignalingThread() {
       if (video_capturer) {
           return signaling_thread->Invoke<FakeVideoCapturer *>(RTC_FROM_HERE,
                                                                rtc::Bind(&RTC::CreateFakeVideoCapturer, this,
                                                                          video_capturer));
       } else {
           return nullptr;
       }
   }           
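The CreateFakeVideoCapturer that gets invoked on the signaling thread above is not listed in this article. A minimal sketch of what it might look like under the author's naming, where the constructor argument is an assumption, could be:

    // Hypothetical sketch: runs on the signaling thread and wraps the Java capturer
    // object in the author's FakeVideoCapturer (constructor signature is assumed).
    FakeVideoCapturer *RTC::CreateFakeVideoCapturer(jobject java_video_capturer) {
        RTC_DCHECK(signaling_thread.get() == rtc::Thread::Current());
        return new FakeVideoCapturer(java_video_capturer);
    }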

Creating the Audio/VideoTrack

This step is comparatively simple: using the Source created in the previous step plus a name, we can create the Audio/VideoTrack. These are also PeerConnectionFactory APIs.

...
    // Create the Audio/VideoTrack
   video_track = rtc->CreateVideoTrack("video_track", video_source.get());
   audio_track = rtc->CreateAudioTrack("audio_track", audio_source);
   ...

    // Create the VideoSource from a capturer
   rtc::scoped_refptr<webrtc::VideoTrackSourceInterface> RTC::CreateVideoSource(cricket::VideoCapturer *capturer) {
       return peer_connection_factory->CreateVideoSource(capturer);
   }

    // Create the VideoTrack
   rtc::scoped_refptr<webrtc::VideoTrackInterface> RTC::CreateVideoTrack(const std::string &label,
                                                                         webrtc::VideoTrackSourceInterface *source) {
       return peer_connection_factory->CreateVideoTrack(label, source);
   }           
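The CreateAudioTrack wrapper called in the snippet above is not listed; assuming it mirrors CreateVideoTrack, a minimal sketch would be:

    // Create the AudioTrack (sketch; assumes the same thin-wrapper pattern as CreateVideoTrack)
    rtc::scoped_refptr<webrtc::AudioTrackInterface> RTC::CreateAudioTrack(const std::string &label,
                                                                          webrtc::AudioSourceInterface *source) {
        return peer_connection_factory->CreateAudioTrack(label, source);
    }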

Creating the LocalMediaStream

Call the PeerConnectionFactory API to create a LocalMediaStream, add the Audio/VideoTrack created earlier to that stream, and finally add the stream to the PeerConnection.

...
    // Create the LocalMediaStream
   transport_stream = rtc->CreateLocalMediaStream("stream");
    // Add the Audio/VideoTrack
   transport_stream->AddTrack(video_track);
   transport_stream->AddTrack(audio_track);
    // Add the stream to the PeerConnection
   peer_connection->AddStream(transport_stream);
   ...           
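The rtc->CreateLocalMediaStream call above is presumably another thin wrapper around the factory; a minimal sketch, under that assumption, would be:

    // Create the LocalMediaStream (sketch; assumes a thin wrapper around the factory call)
    rtc::scoped_refptr<webrtc::MediaStreamInterface> RTC::CreateLocalMediaStream(const std::string &label) {
        return peer_connection_factory->CreateLocalMediaStream(label);
    }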

Creating a Data Channel

Creating a Data Channel is much simpler than setting up the audio/video pipeline above: a single PeerConnection API call creates it. When creating it you can specify a few configuration options, which mainly govern the channel's reliability. Note that on the client a Data Channel is represented by two objects, one for the local end and one for the remote end. The local DataChannel object is obtained from CreateDataChannel, while the remote one is obtained from the PeerConnection's OnDataChannel callback. To send data, call the DataChannel's Send interface; when the remote side sends data, the OnMessage callback is triggered.

// Create the Data Channel
   DataChannel *
   PeerConnection::CreateDataChannel(std::string label, webrtc::DataChannelInit config, DataChannelObserver *observer) {
       rtc::scoped_refptr<webrtc::DataChannelInterface> data_channel = peer_connection->CreateDataChannel(label, &config);
       data_channel->RegisterObserver(observer);
       return new DataChannel(data_channel, observer);
   }

    // Available configuration options
   struct DataChannelInit {
     // Deprecated. Reliability is assumed, and channel will be unreliable if
     // maxRetransmitTime or MaxRetransmits is set.
     bool reliable = false;

     // True if ordered delivery is required.
     bool ordered = true;

     // The max period of time in milliseconds in which retransmissions will be
     // sent. After this time, no more retransmissions will be sent. -1 if unset.
     //
     // Cannot be set along with |maxRetransmits|.
     int maxRetransmitTime = -1;

     // The max number of retransmissions. -1 if unset.
     //
     // Cannot be set along with |maxRetransmitTime|.
     int maxRetransmits = -1;

     // This is set by the application and opaque to the WebRTC implementation.
     std::string protocol;

     // True if the channel has been externally negotiated and we do not send an
     // in-band signalling in the form of an "open" message. If this is true, |id|
     // below must be set; otherwise it should be unset and will be negotiated
     // in-band.
     bool negotiated = false;

     // The stream id, or SID, for SCTP data channels. -1 if unset (see above).
     int id = -1;
   };

    // Send data
   void DataChannel::Send(webrtc::DataBuffer &data_buffer) {
       data_channel->Send(data_buffer);
   }

   // Message received.
   void OnMessage(const webrtc::DataBuffer &buffer) override {
        // When calling back from C++ into Java, the current thread must first be attached to the JVM
       JNIEnv *env = ATTACH_CURRENT_THREAD_IF_NEEDED();
       jbyteArray jbyte_array = CHAR_POINTER_2_J_BYTE_ARRAY(env, buffer.data.cdata(),
                                                            static_cast<int>(buffer.data.size()));
       jclass data_buffer = GET_DATA_BUFFER_CLASS();
       jmethodID init_method = env->GetMethodID(data_buffer, "<init>", "([BZ)V");
       jobject data_buffer_object = env->NewObject(data_buffer, init_method,
                                                   jbyte_array,
                                                   buffer.binary);
       jclass observer_class = env->GetObjectClass(java_observer);
       jmethodID java_event_method = env->GetMethodID(observer_class, "onMessage",
                                                      "(Lpackage/name/of/rtc4j/model/DataBuffer;)V");
        // Look up the corresponding Java callback method and invoke it
       env->CallVoidMethod(java_observer, java_event_method, data_buffer_object);
        // Release the related references
       env->ReleaseByteArrayElements(jbyte_array, env->GetByteArrayElements(jbyte_array, nullptr), JNI_ABORT);
       env->DeleteLocalRef(data_buffer_object);
       env->DeleteLocalRef(observer_class);
   }

    // Attach the current C++ thread to the JVM
   JNIEnv *ATTACH_CURRENT_THREAD_IF_NEEDED() {
       JNIEnv *jni = GetEnv();
       if (jni)
           return jni;
       JavaVMAttachArgs args;
       args.version = JNI_VERSION_1_8;
       args.group = nullptr;
       args.name = const_cast<char *>("JNI-RTC");
   // Deal with difference in signatures between Oracle's jni.h and Android's.
   #ifdef _JavaSOFT_JNI_H_  // Oracle's jni.h violates the JNI spec!
       void *env = nullptr;
   #else
       JNIEnv* env = nullptr;
   #endif
       RTC_CHECK(!g_java_vm->AttachCurrentThread(&env, &args)) << "Failed to attach thread";
       RTC_CHECK(env) << "AttachCurrentThread handed back NULL!";
       jni = reinterpret_cast<JNIEnv *>(env);
       return jni;
   }

   JNIEnv *GetEnv() {
       void *env = nullptr;
       jint status = g_java_vm->GetEnv(&env, JNI_VERSION_1_8);
       RTC_CHECK(((env != nullptr) && (status == JNI_OK)) ||
                 ((env == nullptr) && (status == JNI_EDETACHED)))
           << "Unexpected GetEnv return: " << status << ":" << env;
       return reinterpret_cast<JNIEnv *>(env);
   }

    // Detach the current C++ thread from the JVM
   void DETACH_CURRENT_THREAD_IF_NEEDED() {
       // This function only runs on threads where |g_jni_ptr| is non-NULL, meaning
       // we were responsible for originally attaching the thread, so are responsible
       // for detaching it now.  However, because some JVM implementations (notably
       // Oracle's http://goo.gl/eHApYT) also use the pthread_key_create mechanism,
       // the JVMs accounting info for this thread may already be wiped out by the
       // time this is called. Thus it may appear we are already detached even though
       // it was our responsibility to detach!  Oh well.
       if (!GetEnv())
           return;
       jint status = g_java_vm->DetachCurrentThread();
       RTC_CHECK(status == JNI_OK) << "Failed to detach thread: " << status;
       RTC_CHECK(!GetEnv()) << "Detaching was a successful no-op???";
   }           

In this step I introduced some material about attaching and detaching threads, which deserves a brief explanation. As mentioned before, WebRTC has three main threads: the worker thread, the network thread, and the signaling thread, and WebRTC's callbacks are executed on the worker thread.

This worker thread is an independent thread created from C++ code. Unlike threads that enter C++ from Java, such a thread cannot simply obtain a JNIEnv. For example, given the following code:

public class Widget {
   private native void nativeMethod();
   }           

The corresponding function declaration in the generated native header looks like this:

JNIEXPORT void JNICALL
   Java_xxxxx_nativeMethod(JNIEnv *env, jobject instance);           

As you can see, the first parameter of this declaration is a JNIEnv, through which we can invoke Java code in a reflection-like manner. A thread created independently in C++, however, has no JNIEnv associated with it. If you want to call Java code from such a thread, you must first attach it to the JVM via JavaVM::AttachCurrentThread, after which you can obtain a JNIEnv.

Note that calling AttachCurrentThread on a thread that is already attached to the JavaVM has no effect. If your thread is already attached, you can also obtain the JNIEnv by calling JavaVM::GetEnv; if the thread is not attached, that function returns JNI_EDETACHED. Finally, when the thread no longer needs to call Java code, call DetachCurrentThread to release it.
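Putting the pieces together, the typical pattern in any C++ callback that needs to call back into Java is roughly the following (a condensed sketch using the helpers shown above; the observer class and method names are hypothetical):

    // Hypothetical callback that calls back into Java; java_observer is assumed to be a global reference
    void SomeObserver::OnSomethingHappened() {
        // Attach the (possibly brand-new) C++ thread so we get a JNIEnv
        JNIEnv *env = ATTACH_CURRENT_THREAD_IF_NEEDED();
        jclass observer_class = env->GetObjectClass(java_observer);
        jmethodID method = env->GetMethodID(observer_class, "onSomethingHappened", "()V");
        env->CallVoidMethod(java_observer, method);
        env->DeleteLocalRef(observer_class);
        // DETACH_CURRENT_THREAD_IF_NEEDED() is called later, when the thread shuts down (see RTC::~RTC)
    }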

Establishing a Connection with PeerConnection

Once the stream has been added to the PeerConnection, what remains is to use the PeerConnection APIs and callbacks to establish a connection with the other client. The main APIs involved are CreateOffer, CreateAnswer, SetLocalDescription and SetRemoteDescription. When calling CreateOffer or CreateAnswer we must state whether this client accepts the other client's audio/video. In my scenario the Java server only pushes audio and video to other clients, so I set both ReceiveAudio and ReceiveVideo to false.

void PeerConnection::CreateAnswer(jobject java_observer) {
       create_session_observer->SetGlobalJavaObserver(java_observer, "answer");
       auto options = webrtc::PeerConnectionInterface::RTCOfferAnswerOptions();
       options.offer_to_receive_audio = false;
       options.offer_to_receive_video = false;
       peer_connection->CreateAnswer(create_session_observer, options);
   }

   void PeerConnection::CreateOffer(jobject java_observer) {
       create_session_observer->SetGlobalJavaObserver(java_observer, "offer");
       auto options = webrtc::PeerConnectionInterface::RTCOfferAnswerOptions();
       options.offer_to_receive_audio = false;
       options.offer_to_receive_video = false;
       peer_connection->CreateOffer(create_session_observer, options);
   }

   webrtc::SdpParseError PeerConnection::SetLocalDescription(JNIEnv *env, jobject sdp) {
       webrtc::SdpParseError error;
       webrtc::SessionDescriptionInterface *session_description(
               webrtc::CreateSessionDescription(GET_STRING_FROM_OBJECT(env, sdp, const_cast<char *>("type")),
                                                GET_STRING_FROM_OBJECT(env, sdp, const_cast<char *>("sdp")), &error));
       peer_connection->SetLocalDescription(set_session_description_observer, session_description);
       return error;
   }

   webrtc::SdpParseError PeerConnection::SetRemoteDescription(JNIEnv *env, jobject sdp) {
       webrtc::SdpParseError error;
       webrtc::SessionDescriptionInterface *session_description(
               webrtc::CreateSessionDescription(GET_STRING_FROM_OBJECT(env, sdp, const_cast<char *>("type")),
                                                GET_STRING_FROM_OBJECT(env, sdp, const_cast<char *>("sdp")), &error));
       peer_connection->SetRemoteDescription(set_session_description_observer, session_description);
       return error;
   }           
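The GET_STRING_FROM_OBJECT helper used above is not listed in this article. A minimal sketch of what such a JNI helper might look like, assuming the SDP object exposes its fields as java.lang.String, is:

    // Hypothetical JNI helper: reads a String field from a Java object and copies it into a std::string
    std::string GET_STRING_FROM_OBJECT(JNIEnv *env, jobject object, char *field_name) {
        jclass clazz = env->GetObjectClass(object);
        jfieldID field = env->GetFieldID(clazz, field_name, "Ljava/lang/String;");
        auto j_string = (jstring) env->GetObjectField(object, field);
        const char *chars = env->GetStringUTFChars(j_string, nullptr);
        std::string result(chars);
        env->ReleaseStringUTFChars(j_string, chars);
        env->DeleteLocalRef(j_string);
        env->DeleteLocalRef(clazz);
        return result;
    }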

On the Java side I generally exchange the SDP as follows:

// After the stream has been added to the PeerConnection
   sessionRTCMap.get(headerAccessor.getSessionId()).getPeerConnection().createOffer(sdp -> executor.submit(() -> {
       try {
           sessionRTCMap.get(headerAccessor.getSessionId()).getPeerConnection().setLocalDescription(sdp);
           sendMessage(headerAccessor.getSessionId(), SDP_DESTINATION, sdp);
       } catch (Exception e) {
           log.error("{}", e);
       }
   }));

    // After receiving the answer SDP from the remote peer
   SessionDescription sessionDescription = JSON.parseObject((String) requestResponse.getData(), SessionDescription.class);
   sessionRTCMap.get(headerAccessor.getSessionId()).getPeerConnection().setRemoteDescription(sessionDescription);           

At this point the connection should normally be up and running. Next I will walk through how I release all related resources, which rounds out the normal usage scenario. This part also has quite a few pitfalls: because I was not familiar with WebRTC's pointer-management mechanism at the time, I kept running into leaks and invalid-pointer crashes. Painful memories T.T.

Releasing All Related Resources

Let's start from the release process on the Java side and walk through the whole teardown.

public void releaseResource() {
       lock.lock();
       try {
            if (videoDataChannel != null) { // If a DataChannel was used, release the remote DataChannel object first
               videoDataChannel.close();
               videoDataChannel = null;
           }
           log.info("Release remote video data channel");
            if (localVideoDataChannel != null) { // Then release the local DataChannel object
               localVideoDataChannel.close();
               localVideoDataChannel = null;
           }
           log.info("Release local video data channel");
            if (peerConnection != null) { // Release the PeerConnection object
               peerConnection.close();
               peerConnection = null;
           }
           log.info("Release peer connection");
            if (rtc != null) { // Release the PeerConnectionFactory-related objects
               rtc.close();
           }
           log.info("Release rtc");
       } catch (Exception ignored) {
        } finally {
           destroyed = true;
           lock.unlock();
       }
   }           

Then the corresponding C++ release code:

DataChannel::~DataChannel() {
        data_channel->UnregisterObserver(); // First unregister the observer that was registered earlier
        delete data_channel_observer; // Destroy the observer object
        data_channel->Close(); // Close the Data Channel
        //rtc::scoped_refptr<webrtc::DataChannelInterface> data_channel; (Created by webrtc::PeerConnectionInterface::CreateDataChannel)
        data_channel = nullptr; // Release the Data Channel object (ref-counted pointer)
   }

   PeerConnection::~PeerConnection() {
        peer_connection->Close(); // Close the PeerConnection
        //rtc::scoped_refptr<webrtc::PeerConnectionInterface> peer_connection; (Created by webrtc::PeerConnectionFactoryInterface::CreatePeerConnection)
        peer_connection = nullptr; // Release the PeerConnection object (ref-counted pointer)
        delete peer_connection_observer; // Destroy the observers that were used
        delete set_session_description_observer; // Destroy the observers that were used
        delete create_session_observer; // Destroy the observers that were used
   }

   RTC::~RTC() {
       //rtc::scoped_refptr<webrtc::PeerConnectionFactoryInterface> peer_connection_factory; (Created by webrtc::CreatePeerConnectionFactory)
        peer_connection_factory = nullptr; // Release the PeerConnectionFactory
        WEBRTC_LOG("Destroy peer connection factory", INFO);
        worker_thread->Invoke<void>(RTC_FROM_HERE, rtc::Bind(&RTC::ReleaseAudioDeviceModule, this)); // Release the AudioDeviceModule on the worker thread, because that is where it was created
        signaling_thread->Invoke<void>(RTC_FROM_HERE, rtc::Bind(&RTC::DetachCurrentThread, this)); // Detach the signaling thread
        worker_thread->Invoke<void>(RTC_FROM_HERE, rtc::Bind(&RTC::DetachCurrentThread, this)); // Detach the worker thread
        network_thread->Invoke<void>(RTC_FROM_HERE, rtc::Bind(&RTC::DetachCurrentThread, this)); // Detach the network thread
        worker_thread->Stop(); // Stop the thread
        signaling_thread->Stop(); // Stop the thread
        network_thread->Stop(); // Stop the thread
        worker_thread.reset(); // Destroy the thread (smart pointer)
        signaling_thread.reset(); // Destroy the thread (smart pointer)
        network_thread.reset(); // Destroy the thread (smart pointer)
        network_manager = nullptr; // Destroy the NetworkManager (smart pointer)
        socket_factory = nullptr; // Destroy the SocketFactory (smart pointer)
       WEBRTC_LOG("Stop threads", INFO);
       if (video_capturer) {
           JNIEnv *env = ATTACH_CURRENT_THREAD_IF_NEEDED();
            env->DeleteGlobalRef(video_capturer); // Delete the Java global reference to the VideoCapturer; I store this global reference (env->NewGlobalRef(video_capturer)) in the RTC class
            // The Java reference to the AudioCapturer is not released here because it is held inside the AudioDeviceModule
       }
   }           

With that, if you only deal with the normal WebRTC usage scenario, you now know how to call the WebRTC Native APIs from Java. The remaining part of this series covers the API changes I made for my own business scenario; if you are interested in that as well, read on.

Lessons Learned

A bit of experience to share: when I worked on this part, I first looked at how WebRTC is used in JavaScript to get roughly familiar with the Native APIs, and I also referred to the Node.js implementation. When I hit problems I searched Google's forum, WebRTC-Discuss, and if none of the above turned up a solution, I walked through all the code relevant to the feature I wanted to implement =.=

About This Article

More articles worth reading are collected in 貝貝貓的文章目錄 (my article index).

Copyright notice: Unless otherwise stated, all articles on this blog are licensed under the BY-NC-SA license. Please credit the source when reposting!

Creation notice: This article was written based on all of the reference material listed below, which may involve copying, modification, or adaptation. All images come from the internet; if any of them infringes your rights, please contact me and I will remove it immediately.

References

[1] JNI的替代者—使用JNA通路Java外部功能接口
[2] Linux共享對象之編譯參數fPIC
[3] Android JNI 使用總結
[4] FFmpeg 倉庫