It supports HTTP streaming as well as local files. After the container is demuxed, the video and audio tracks are each handed to their own MediaCodec for decoding. Let's look at the video part first. Since video has to be rendered, we also need a SurfaceView to supply the output Surface:
private MediaExtractor extractorVideo;
private MediaCodec decoderVideo;

extractorVideo = new MediaExtractor();
extractorVideo.setDataSource("myTest.mp4");
for (int i = 0; i < extractorVideo.getTrackCount(); i++) {
    MediaFormat format = extractorVideo.getTrackFormat(i);
    String mime = format.getString(MediaFormat.KEY_MIME);
    Log.d(TAG, "mime=>" + mime);
    if (mime.startsWith("video/")) {
        videoTrack = i;
        extractorVideo.selectTrack(videoTrack);
        decoderVideo = MediaCodec.createDecoderByType(mime);
        decoderVideo.configure(format, surface, null, 0);
        break;
    }
}
if (videoTrack >= 0) {
    if (decoderVideo == null) {
        Log.e(TAG, "Can't find video info!");
        return;
    }
    decoderVideo.start();
}
extractorVideo locates the video track by its MIME type, and decoderVideo.configure() hands the video format and the surface over to the decoder. From that point on, decoderVideo takes over both the decoding and the rendering work.
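The track-selection step boils down to a linear scan for the first track whose MIME type has the right prefix. As a plain-Java sketch (findTrack is a hypothetical helper for illustration, not part of the MediaCodec API):

```java
// Hypothetical helper mirroring the track-selection loop above:
// scan the MIME strings of all tracks and return the index of the
// first one whose type starts with the given prefix, e.g. "video/".
static int findTrack(String[] mimes, String prefix) {
    for (int i = 0; i < mimes.length; i++) {
        if (mimes[i].startsWith(prefix)) {
            return i;
        }
    }
    return -1; // no matching track in this container
}
```

For example, findTrack(new String[] {"audio/mp4a-latm", "video/avc"}, "video/") returns 1, and a return value of -1 plays the same role as videoTrack staying negative above.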
Next comes the decoding part:
ByteBuffer[] inputBuffersVideo = null;
ByteBuffer[] outputBuffersVideo = null;
BufferInfo infoVideo = null;
if (videoTrack >= 0) {
    inputBuffersVideo = decoderVideo.getInputBuffers();
    outputBuffersVideo = decoderVideo.getOutputBuffers();
    infoVideo = new BufferInfo();
}
boolean isEOS = false;
long startMs = System.currentTimeMillis();
while (!Thread.interrupted()) {
    if (videoTrack >= 0) {
        if (!isEOS) {
            int inIndex = -1;
            try {
                inIndex = decoderVideo.dequeueInputBuffer(10000);
            } catch (Exception e) {
                e.printStackTrace();
            }
            if (inIndex >= 0) {
                ByteBuffer buffer = inputBuffersVideo[inIndex];
                int sampleSize = extractorVideo.readSampleData(buffer, 0);
                if (sampleSize < 0) {
                    // We shouldn't stop the playback at this point; just pass the EOS
                    // flag to the decoder, we will get it back from dequeueOutputBuffer
                    decoderVideo.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    isEOS = true;
                } else {
                    decoderVideo.queueInputBuffer(inIndex, 0, sampleSize, extractorVideo.getSampleTime(), 0);
                    extractorVideo.advance();
                }
            }
        }
        int outIndex = -1;
        try {
            outIndex = decoderVideo.dequeueOutputBuffer(infoVideo, 10000);
        } catch (Exception e) {
            e.printStackTrace();
        }
        switch (outIndex) {
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
                outputBuffersVideo = decoderVideo.getOutputBuffers();
                break;
            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                Log.d(TAG, "New format " + decoderVideo.getOutputFormat());
                break;
            case MediaCodec.INFO_TRY_AGAIN_LATER:
                Log.d(TAG, "dequeueOutputBuffer timed out!");
                break;
            default:
                if (outIndex >= 0) {
                    decoderVideo.releaseOutputBuffer(outIndex, true);
                    // We use a very simple clock to keep the video FPS, or the video
                    // playback will be too fast
                    while (infoVideo.presentationTimeUs / 1000 > (System.currentTimeMillis() - startMs)) {
                        try {
                            Thread.sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            Thread.currentThread().interrupt();
                            break;
                        }
                    }
                }
                break;
        }
        // All decoded frames have been rendered; we can stop playing now
        if ((infoVideo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            Log.d(TAG, "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
            break;
        }
    }
}
if (videoTrack >= 0) {
    decoderVideo.stop();
    decoderVideo.release();
}
extractorVideo.release();
First, call decoderVideo.dequeueInputBuffer(10000) to obtain an input buffer, fill it with compressed sample data using extractorVideo.readSampleData(buffer, 0), and then call decoderVideo.queueInputBuffer() to queue it for decoding. When decoderVideo.dequeueOutputBuffer(infoVideo, 10000) returns a non-negative index, a frame has been decoded successfully, and decoderVideo.releaseOutputBuffer(outIndex, true) renders it to the surface. That completes the video part.
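The simple pacing clock in the render loop can be isolated into a small helper to make the arithmetic explicit (waitMillis is a hypothetical name; the loop above inlines this logic directly):

```java
// Hypothetical helper for the simple pacing clock used in the render loop:
// a frame stamped presentationTimeUs is due at startMs + presentationTimeUs/1000
// on the wall clock, so return how many more milliseconds the loop should wait
// before releasing the output buffer (0 means render immediately).
static long waitMillis(long presentationTimeUs, long startMs, long nowMs) {
    long dueMs = startMs + presentationTimeUs / 1000;
    return Math.max(0, dueMs - nowMs);
}
```

For example, a frame stamped 40000 µs with startMs = 1000 is due at wall-clock 1040 ms: at nowMs = 1010 the loop should still wait 30 ms, while at nowMs = 1050 the frame is already late and should be rendered right away.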
Question: Can this be used with the RTSP protocol? I tried passing an RTSP URI to setDataSource() and it failed.
Reply: see http://cruxintw.blogspot.tw/2015/03/androidmediaplayer-rtsp.html