This article offers a rough, discussion-starting overview of the decoding and rendering parts of an RTMP/RTSP player on the Android platform (Github).
Decoding
When it comes to decoding, everyone knows about software and hardware decoding; some companies even consider hardware decoding universal enough to phase out software decoding entirely. If device compatibility matters, supporting both is the better choice. With that in mind, the 大牛直播SDK classifies decoding as follows:
1. Software decoding: the raw data is available after decoding, so raw-data callbacks, snapshots, and similar follow-up operations are possible;
2. Hardware decoding: the raw data is available after decoding, so raw-data callbacks, snapshots, and similar follow-up operations are possible;
3. Hardware decoding in surface mode: frames are rendered directly to the configured surface; snapshots and decoded-data callbacks are not available.
You may wonder: with mode 2 available, why also support mode 3? What are the respective advantages of modes 2 and 3?
Hardware decoding straight to a surface generally enjoys better chip support and wider decoder compatibility, and it avoids data copies, so resource usage is lower; the downside is that the decoded raw data is inaccessible, making it more of a black box. Mode 2 balances the lower resource usage of hardware decoding (relative to software decoding) with the ability to post-process the raw data (e.g. further processing of the decoded YUV/RGB frames); its decoder compatibility is slightly worse than mode 3's, but data handling is more flexible.
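The trade-offs above can be condensed into a small mode-selection sketch. Everything below (the enum, the flags, the chooser and its policy) is illustrative and not part of the SDK's actual API:

```java
// Illustrative only: models the three decode modes described above.
enum DecodeMode {
    SOFTWARE,      // mode 1: software decode; raw-data callbacks and snapshots available
    HW_TO_BUFFER,  // mode 2: hardware decode to buffers; raw data still accessible
    HW_TO_SURFACE  // mode 3: hardware decode straight to a Surface; no raw-data access
}

final class DecodeModeChooser {
    /**
     * Hypothetical policy: prefer surface-mode hardware decode unless the
     * caller needs raw frames (callbacks/snapshots); fall back to software
     * decode when hardware decode is unavailable.
     */
    static DecodeMode choose(boolean hwDecodeSupported, boolean needRawFrames) {
        if (!hwDecodeSupported) return DecodeMode.SOFTWARE;
        return needRawFrames ? DecodeMode.HW_TO_BUFFER : DecodeMode.HW_TO_SURFACE;
    }
}
```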
Related interfaces:
/**
 * Set Video H.264 HW decoder
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isHWDecoder: 0: software decoder; 1: hardware decoder.
 *
 * @return {0} if successful
 */
public native int SetSmartPlayerVideoHWDecoder(long handle, int isHWDecoder);
/**
 * Set Video H.265 (HEVC) HW decoder
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isHevcHWDecoder: 0: software decoder; 1: hardware decoder.
 *
 * @return {0} if successful
 */
public native int SetSmartPlayerVideoHevcHWDecoder(long handle, int isHevcHWDecoder);
/**
 * Set Surface view (the SurfaceView used for playback).
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param surface: surface view
 *
 * <pre> NOTE: if the surface is not set, or is set to null, only audio will be played back. </pre>
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurface(long handle, Object surface);
Since not every device supports hardware decoding, the 大牛直播SDK's approach is to probe for hardware decode support first, and switch straight to software decoding when it is unavailable.
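Since the setter documented above returns 0 on success, the fallback can be driven by its return value. A minimal sketch of that policy, with the SDK call hidden behind a hypothetical interface (the interface and class names are illustrative, not SDK APIs):

```java
// Illustrative stand-in for SetSmartPlayerVideoHWDecoder(handle, isHWDecoder),
// which per the SDK docs returns 0 on success.
interface HwDecoderSetter {
    int setHwDecoder(int isHWDecoder); // 0 = software, 1 = hardware
}

final class DecoderFallback {
    /**
     * Try to enable hardware decode; if the setter reports failure
     * (non-zero return), fall back to software decode.
     * Returns 1 if hardware decode ended up enabled, 0 otherwise.
     */
    static int enableWithFallback(HwDecoderSetter setter) {
        if (setter.setHwDecoder(1) == 0) {
            return 1; // hardware decode accepted
        }
        setter.setHwDecoder(0); // revert to software decode
        return 0;
    }
}
```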
Rendering
For rendering, the 大牛直播SDK's RTMP and RTSP players support two modes: a plain SurfaceView and a GLSurfaceView. The plain surface has better compatibility, while GLSurfaceView rendering is somewhat smoother; in addition, plain surface mode supports a few anti-aliasing settings. Both modes offer a fill-mode option for the video picture (whether or not to preserve the aspect ratio). The interfaces are designed as follows:
/**
 * Set the fill mode of the video picture: fill the whole view, or scale to fit
 * the view while preserving the aspect ratio. If not set, the whole view is filled.
 * @param handle: return value from SmartPlayerOpen()
 * @param render_scale_mode 0: fill the whole view; 1: scale to fit, preserving aspect ratio. Default is 0.
 * @return {0} if successful
 */
public native int SmartPlayerSetRenderScaleMode(long handle, int render_scale_mode);
/**
 * Set the render format in SurfaceView mode (i.e. when the second parameter of
 * NTRenderer.CreateRenderer is false).
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param format: 0: RGB565 (the default if not set); 1: ARGB8888
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurfaceRenderFormat(long handle, int format);
/**
 * Enable anti-aliasing in SurfaceView mode (i.e. when the second parameter of
 * NTRenderer.CreateRenderer is false). Note: enabling anti-aliasing may hurt
 * performance, so use it with care.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param isEnableAntiAlias: 0: anti-aliasing off (the default if not set); 1: anti-aliasing on
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetSurfaceAntiAlias(long handle, int isEnableAntiAlias);
For audio output, both AudioTrack and OpenSL ES are options. For the sake of compatibility, AudioTrack mode is the safer choice; better still, expose an option and let the user decide:
/**
 * Set AudioOutput Type (the audio output type)
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param use_audiotrack:
 *
 * <pre> NOTE: if use_audiotrack is 0, the output device is auto-selected; if 1, AudioTrack mode is used. </pre>
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetAudioOutputType(long handle, int use_audiotrack);
Video view flipping/rotation
/**
 * Set vertical video flipping
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param is_flip: 0: no flip; 1: flip
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetFlipVertical(long handle, int is_flip);
/**
 * Set horizontal video flipping
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param is_flip: 0: no flip; 1: flip
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetFlipHorizontal(long handle, int is_flip);
/**
 * Set clockwise rotation. Note: any angle other than 0 degrees costs extra performance.
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param degrees: currently supports rotation by 0, 90, 180 and 270 degrees
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetRotation(long handle, int degrees);
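Because only 0/90/180/270 are accepted, it helps to normalize the caller's angle before handing it to SmartPlayerSetRotation. A small hypothetical helper (not part of the SDK):

```java
final class RotationUtil {
    /**
     * Normalize an arbitrary angle to the equivalent supported clockwise
     * rotation (0, 90, 180 or 270). The angle must be a multiple of 90;
     * otherwise -1 is returned so the caller can skip the SDK call.
     */
    static int normalizeDegrees(int degrees) {
        int d = ((degrees % 360) + 360) % 360; // map into [0, 360)
        return (d % 90 == 0) ? d : -1;
    }
}
```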
Decoded raw-data callbacks
In some scenarios, developers need to process the decoded YUV/RGB or PCM data themselves. For this, an interface model for decoded-data callbacks is needed:
/**
 * Set External Render (the YUV/RGB data callback)
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param external_render: External Render
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetExternalRender(long handle, Object external_render);
/**
 * Set External Audio Output (the PCM data callback)
 *
 * @param handle: return value from SmartPlayerOpen()
 *
 * @param external_audio_output: External Audio Output
 *
 * @return {0} if successful
 */
public native int SmartPlayerSetExternalAudioOutput(long handle, Object external_audio_output);
Example invocation:
//libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender());
//libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender());
With the raw data in hand, secondary processing (such as face recognition) can be performed:
class RGBAExternalRender implements NTExternalRender {
    // public static final int NT_FRAME_FORMAT_RGBA = 1;
    // public static final int NT_FRAME_FORMAT_ABGR = 2;
    // public static final int NT_FRAME_FORMAT_I420 = 3;
    private int width_ = 0;
    private int height_ = 0;
    private int row_bytes_ = 0;
    private ByteBuffer rgba_buffer_ = null;

    @Override
    public int getNTFrameFormat() {
        Log.i(TAG, "RGBAExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_RGBA);
        return NT_FRAME_FORMAT_RGBA;
    }

    @Override
    public void onNTFrameSizeChanged(int width, int height) {
        width_ = width;
        height_ = height;
        row_bytes_ = width_ * 4;
        Log.i(TAG, "RGBAExternalRender::onNTFrameSizeChanged width_:" + width_
                + " height_:" + height_);
        rgba_buffer_ = ByteBuffer.allocateDirect(row_bytes_ * height_);
    }

    @Override
    public ByteBuffer getNTPlaneByteBuffer(int index) {
        if (index == 0) {
            return rgba_buffer_;
        } else {
            Log.e(TAG, "RGBAExternalRender::getNTPlaneByteBuffer index error:" + index);
            return null;
        }
    }

    @Override
    public int getNTPlanePerRowBytes(int index) {
        if (index == 0) {
            return row_bytes_;
        } else {
            Log.e(TAG, "RGBAExternalRender::getNTPlanePerRowBytes index error:" + index);
            return 0;
        }
    }

    @Override
    public void onNTRenderFrame(int width, int height, long timestamp) {
        if (rgba_buffer_ == null)
            return;
        rgba_buffer_.rewind();
        // copy buffer
        // test
        // byte[] test_buffer = new byte[16];
        // rgba_buffer_.get(test_buffer);
        Log.i(TAG, "RGBAExternalRender:onNTRenderFrame w=" + width + " h=" + height
                + " timestamp=" + timestamp);
        // Log.i(TAG, "RGBAExternalRender:onNTRenderFrame rgba:" +
        // bytesToHexString(test_buffer));
    }
}
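With RGBA frames the row stride is simply width * 4, so individual pixels can be read straight out of the callback buffer. A hypothetical helper (not part of the SDK) that extracts one pixel as a packed 0xAARRGGBB int:

```java
import java.nio.ByteBuffer;

final class RgbaPixelReader {
    /**
     * Read the pixel at (x, y) from a packed RGBA buffer with the given
     * row stride in bytes (for a tightly packed frame, rowBytes = width * 4),
     * and return it as 0xAARRGGBB.
     */
    static int pixelAt(ByteBuffer rgba, int rowBytes, int x, int y) {
        int base = y * rowBytes + x * 4;   // byte offset of the pixel
        int r = rgba.get(base) & 0xFF;     // bytes are stored R, G, B, A
        int g = rgba.get(base + 1) & 0xFF;
        int b = rgba.get(base + 2) & 0xFF;
        int a = rgba.get(base + 3) & 0xFF;
        return (a << 24) | (r << 16) | (g << 8) | b;
    }
}
```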
class I420ExternalRender implements NTExternalRender {
    // public static final int NT_FRAME_FORMAT_RGBA = 1;
    // public static final int NT_FRAME_FORMAT_ABGR = 2;
    // public static final int NT_FRAME_FORMAT_I420 = 3;
    private int width_ = 0;
    private int height_ = 0;
    private int y_row_bytes_ = 0;
    private int u_row_bytes_ = 0;
    private int v_row_bytes_ = 0;
    private ByteBuffer y_buffer_ = null;
    private ByteBuffer u_buffer_ = null;
    private ByteBuffer v_buffer_ = null;

    @Override
    public int getNTFrameFormat() {
        Log.i(TAG, "I420ExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_I420);
        return NT_FRAME_FORMAT_I420;
    }

    @Override
    public void onNTFrameSizeChanged(int width, int height) {
        width_ = width;
        height_ = height;
        // Row strides are rounded up to a multiple of 16 bytes.
        y_row_bytes_ = (width_ + 15) & (~15);
        u_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);
        v_row_bytes_ = ((width_ + 1) / 2 + 15) & (~15);
        y_buffer_ = ByteBuffer.allocateDirect(y_row_bytes_ * height_);
        u_buffer_ = ByteBuffer.allocateDirect(u_row_bytes_ * ((height_ + 1) / 2));
        v_buffer_ = ByteBuffer.allocateDirect(v_row_bytes_ * ((height_ + 1) / 2));
        Log.i(TAG, "I420ExternalRender::onNTFrameSizeChanged width_=" + width_
                + " height_=" + height_ + " y_row_bytes_=" + y_row_bytes_
                + " u_row_bytes_=" + u_row_bytes_ + " v_row_bytes_=" + v_row_bytes_);
    }

    @Override
    public ByteBuffer getNTPlaneByteBuffer(int index) {
        if (index == 0) {
            return y_buffer_;
        } else if (index == 1) {
            return u_buffer_;
        } else if (index == 2) {
            return v_buffer_;
        } else {
            Log.e(TAG, "I420ExternalRender::getNTPlaneByteBuffer index error:" + index);
            return null;
        }
    }

    @Override
    public int getNTPlanePerRowBytes(int index) {
        if (index == 0) {
            return y_row_bytes_;
        } else if (index == 1) {
            return u_row_bytes_;
        } else if (index == 2) {
            return v_row_bytes_;
        } else {
            Log.e(TAG, "I420ExternalRender::getNTPlanePerRowBytes index error:" + index);
            return 0;
        }
    }

    @Override
    public void onNTRenderFrame(int width, int height, long timestamp) {
        if (y_buffer_ == null || u_buffer_ == null || v_buffer_ == null)
            return;
        y_buffer_.rewind();
        u_buffer_.rewind();
        v_buffer_.rewind();
        /*
        // One-shot test: interleave U/V into NV21 and save a JPEG snapshot.
        if (!is_saved_image) {
            is_saved_image = true;
            int y_len = y_row_bytes_ * height_;
            int u_len = u_row_bytes_ * ((height_ + 1) / 2);
            int v_len = v_row_bytes_ * ((height_ + 1) / 2);
            int data_len = y_len + (y_row_bytes_ * ((height_ + 1) / 2));
            byte[] nv21_data = new byte[data_len];
            byte[] u_data = new byte[u_len];
            byte[] v_data = new byte[v_len];
            y_buffer_.get(nv21_data, 0, y_len);
            u_buffer_.get(u_data, 0, u_len);
            v_buffer_.get(v_data, 0, v_len);
            int[] strides = new int[2];
            strides[0] = y_row_bytes_;
            strides[1] = y_row_bytes_;
            int loop_row_c = (height_ + 1) / 2;
            int loop_c = (width_ + 1) / 2;
            int dst_row = y_len;
            int src_v_row = 0;
            int src_u_row = 0;
            for (int i = 0; i < loop_row_c; ++i) {
                int dst_pos = dst_row;
                for (int j = 0; j < loop_c; ++j) {
                    nv21_data[dst_pos++] = v_data[src_v_row + j];
                    nv21_data[dst_pos++] = u_data[src_u_row + j];
                }
                dst_row += y_row_bytes_;
                src_v_row += v_row_bytes_;
                src_u_row += u_row_bytes_;
            }
            String imagePath = "/sdcard" + "/" + "testonv21" + ".jpeg";
            Log.e(TAG, "I420ExternalRender::begin test save image++ image_path:" + imagePath);
            try {
                File file = new File(imagePath);
                FileOutputStream image_os = new FileOutputStream(file);
                YuvImage image = new YuvImage(nv21_data, ImageFormat.NV21, width_, height_, strides);
                image.compressToJpeg(new android.graphics.Rect(0, 0, width_, height_), 50, image_os);
                image_os.flush();
                image_os.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
            Log.e(TAG, "I420ExternalRender::begin test save image--");
        }
        */
        Log.i(TAG, "I420ExternalRender::onNTRenderFrame w=" + width + " h=" + height
                + " timestamp=" + timestamp);
        // copy buffer
        // test
        // byte[] test_buffer = new byte[16];
        // y_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame y data:" + bytesToHexString(test_buffer));
        // u_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame u data:" + bytesToHexString(test_buffer));
        // v_buffer_.get(test_buffer);
        // Log.i(TAG, "I420ExternalRender::onNTRenderFrame v data:" + bytesToHexString(test_buffer));
    }
}
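The commented-out block in onNTRenderFrame above interleaves the U and V planes into NV21's VUVU... chroma layout before saving a JPEG. The core of that conversion can be isolated as a pure helper; this is a sketch mirroring the loop above, with all names illustrative:

```java
final class I420ToNv21 {
    /**
     * Interleave separate U and V planes (each with its own row stride,
     * which may include padding) into NV21's VUVU... chroma plane.
     * dst must hold 2 * chromaWidth * chromaHeight bytes.
     */
    static void interleaveVU(byte[] u, int uRowBytes,
                             byte[] v, int vRowBytes,
                             int chromaWidth, int chromaHeight, byte[] dst) {
        int dstPos = 0;
        for (int row = 0; row < chromaHeight; ++row) {
            int uRow = row * uRowBytes; // start of this row in each padded plane
            int vRow = row * vRowBytes;
            for (int col = 0; col < chromaWidth; ++col) {
                dst[dstPos++] = v[vRow + col]; // NV21 stores V before U
                dst[dstPos++] = u[uRow + col];
            }
        }
    }
}
```

For a full-frame conversion, the Y plane is copied row by row first and the interleaved chroma appended after it, as the commented block does.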
Summary
These are some of the considerations around decoding and rendering when building an RTMP/RTSP player on Android. They are offered as a starting point for discussion; interested developers can adapt them as needed.