Android MediaExtractor and mp3 stream


I'm trying to play an mp3 stream using MediaExtractor/MediaCodec. MediaPlayer is out of the question due to its latency and long buffer size.

The sample code I have found is this: http://dpsm.wordpress.com/category/android/

The code samples are partial (?) and use a file instead of a stream.

I have been trying to adapt the example to play an audio stream, but I can't get my head around how it is supposed to work. The Android documentation is, as usual, no help.

I understand that I first get the information about the stream, presumably set up the AudioTrack with that information (does the code sample include the AudioTrack initialization?), and then open an input buffer and an output buffer.

I have recreated the code for this, filling in what I could guess at for the missing parts, but no audio comes out of it.

Can someone point me in the right direction to understand how this is supposed to work?

public final String LOG_TAG = "MediaDecoderExample";
private static int TIMEOUT_US = -1;
MediaCodec codec;
MediaExtractor extractor;
MediaFormat format;
ByteBuffer[] codecInputBuffers;
ByteBuffer[] codecOutputBuffers;
boolean sawInputEOS = false;
boolean sawOutputEOS = false;
AudioTrack mAudioTrack;
BufferInfo info;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    String url = "http://82.201.100.9:8000/radio538_web_mp3";
    extractor = new MediaExtractor();

    try {
        extractor.setDataSource(url);
    } catch (IOException e) {
    }

    format = extractor.getTrackFormat(0);
    String mime = format.getString(MediaFormat.KEY_MIME);
    int sampleRate = format.getInteger(MediaFormat.KEY_SAMPLE_RATE);

    Log.i(LOG_TAG, "===========================");
    Log.i(LOG_TAG, "url " + url);
    Log.i(LOG_TAG, "mime type : " + mime);
    Log.i(LOG_TAG, "sample rate : " + sampleRate);
    Log.i(LOG_TAG, "===========================");

    codec = MediaCodec.createDecoderByType(mime);
    codec.configure(format, null, null, 0);
    codec.start();

    codecInputBuffers = codec.getInputBuffers();
    codecOutputBuffers = codec.getOutputBuffers();

    extractor.selectTrack(0);

    mAudioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO,
            AudioFormat.ENCODING_PCM_16BIT,
            AudioTrack.getMinBufferSize(
                    sampleRate,
                    AudioFormat.CHANNEL_OUT_STEREO,
                    AudioFormat.ENCODING_PCM_16BIT
                    ),
            AudioTrack.MODE_STREAM
            );

    info = new BufferInfo();

    input();
    output();
}

private void output() {
    final int res = codec.dequeueOutputBuffer(info, TIMEOUT_US);
    if (res >= 0) {
        int outputBufIndex = res;
        ByteBuffer buf = codecOutputBuffers[outputBufIndex];

        final byte[] chunk = new byte[info.size];
        buf.get(chunk); // read the buffer all at once
        buf.clear();    // ** MUST DO!!! otherwise the next time you get this same buffer bad things happen

        if (chunk.length > 0) {
            mAudioTrack.write(chunk, 0, chunk.length);
        }
        codec.releaseOutputBuffer(outputBufIndex, false /* render */);

        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            sawOutputEOS = true;
        }
    } else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        codecOutputBuffers = codec.getOutputBuffers();
    } else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        final MediaFormat oformat = codec.getOutputFormat();
        Log.d(LOG_TAG, "Output format has changed to " + oformat);
        mAudioTrack.setPlaybackRate(oformat.getInteger(MediaFormat.KEY_SAMPLE_RATE));
    }
}

private void input() {
    Log.i(LOG_TAG, "inputLoop()");
    int inputBufIndex = codec.dequeueInputBuffer(TIMEOUT_US);
    Log.i(LOG_TAG, "inputBufIndex : " + inputBufIndex);

    if (inputBufIndex >= 0) {
        ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];

        int sampleSize = extractor.readSampleData(dstBuf, 0);
        Log.i(LOG_TAG, "sampleSize : " + sampleSize);
        long presentationTimeUs = 0;
        if (sampleSize < 0) {
            Log.i(LOG_TAG, "Saw input end of stream!");
            sawInputEOS = true;
            sampleSize = 0;
        } else {
            presentationTimeUs = extractor.getSampleTime();
            Log.i(LOG_TAG, "presentationTimeUs " + presentationTimeUs);
        }

        codec.queueInputBuffer(inputBufIndex,
                               0, // offset
                               sampleSize,
                               presentationTimeUs,
                               sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
        if (!sawInputEOS) {
            Log.i(LOG_TAG, "extractor.advance()");
            extractor.advance();
        }
    }
}
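(One thing worth checking, separate from the looping issue discussed in the answer below: the AudioTrack above is hardcoded to stereo, while the track format can report the actual channel count. A minimal sketch of deriving the channel configuration, assuming the stream exposes MediaFormat.KEY_CHANNEL_COUNT and reusing the format, sampleRate and mAudioTrack fields from the code above:

    // Sketch only: pick the AudioTrack channel mask from the track format
    // instead of hardcoding stereo (1 = mono, 2 = stereo).
    int channelCount = format.containsKey(MediaFormat.KEY_CHANNEL_COUNT)
            ? format.getInteger(MediaFormat.KEY_CHANNEL_COUNT)
            : 2; // fall back to stereo if the extractor does not report it
    int channelConfig = (channelCount == 1)
            ? AudioFormat.CHANNEL_OUT_MONO
            : AudioFormat.CHANNEL_OUT_STEREO;

    mAudioTrack = new AudioTrack(
            AudioManager.STREAM_MUSIC,
            sampleRate,
            channelConfig,
            AudioFormat.ENCODING_PCM_16BIT,
            AudioTrack.getMinBufferSize(sampleRate, channelConfig,
                    AudioFormat.ENCODING_PCM_16BIT),
            AudioTrack.MODE_STREAM);
)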

EDIT: adding the logcat output for ideas.

03-10 16:47:54.115: I/MediaDecoderExample(24643): ===========================
03-10 16:47:54.115: I/MediaDecoderExample(24643): url ....
03-10 16:47:54.115: I/MediaDecoderExample(24643): mime type : audio/mpeg
03-10 16:47:54.115: I/MediaDecoderExample(24643): sample rate : 32000
03-10 16:47:54.115: I/MediaDecoderExample(24643): ===========================
03-10 16:47:54.120: I/OMXClient(24643): Using client-side OMX mux.
03-10 16:47:54.150: I/Reverb(24643): getpid() 24643, IPCThreadState::self()->getCallingPid() 24643
03-10 16:47:54.150: I/MediaDecoderExample(24643): inputLoop()
03-10 16:47:54.155: I/MediaDecoderExample(24643): inputBufIndex : 0
03-10 16:47:54.155: I/MediaDecoderExample(24643): sampleSize : 432
03-10 16:47:54.155: I/MediaDecoderExample(24643): presentationTimeUs 0
03-10 16:47:54.155: I/MediaDecoderExample(24643): extractor.advance()
03-10 16:47:59.085: D/HTTPBase(24643): [2] Network bandwidth = 187 Kbps
03-10 16:47:59.085: D/NuCachedSource2(24643): Remaining (64K), HighWaterThreshold (20480K)
03-10 16:48:04.635: D/HTTPBase(24643): [3] Network bandwidth = 141 Kbps
03-10 16:48:04.635: D/NuCachedSource2(24643): Remaining (128K), HighWaterThreshold (20480K)
03-10 16:48:09.930: D/HTTPBase(24643): [4] Network bandwidth = 127 Kbps
03-10 16:48:09.930: D/NuCachedSource2(24643): Remaining (192K), HighWaterThreshold (20480K)
03-10 16:48:15.255: D/HTTPBase(24643): [5] Network bandwidth = 120 Kbps
03-10 16:48:15.255: D/NuCachedSource2(24643): Remaining (256K), HighWaterThreshold (20480K)
03-10 16:48:20.775: D/HTTPBase(24643): [6] Network bandwidth = 115 Kbps
03-10 16:48:20.775: D/NuCachedSource2(24643): Remaining (320K), HighWaterThreshold (20480K)
03-10 16:48:26.510: D/HTTPBase(24643): [7] Network bandwidth = 111 Kbps
03-10 16:48:26.510: D/NuCachedSource2(24643): Remaining (384K), HighWaterThreshold (20480K)
03-10 16:48:31.740: D/HTTPBase(24643): [8] Network bandwidth = 109 Kbps
03-10 16:48:31.740: D/NuCachedSource2(24643): Remaining (448K), HighWaterThreshold (20480K)
03-10 16:48:37.260: D/HTTPBase(24643): [9] Network bandwidth = 107 Kbps
03-10 16:48:37.260: D/NuCachedSource2(24643): Remaining (512K), HighWaterThreshold (20480K)
03-10 16:48:42.620: D/HTTPBase(24643): [10] Network bandwidth = 106 Kbps
03-10 16:48:42.620: D/NuCachedSource2(24643): Remaining (576K), HighWaterThreshold (20480K)
03-10 16:48:48.295: D/HTTPBase(24643): [11] Network bandwidth = 105 Kbps
03-10 16:48:48.295: D/NuCachedSource2(24643): Remaining (640K), HighWaterThreshold (20480K)
03-10 16:48:53.735: D/HTTPBase(24643): [12] Network bandwidth = 104 Kbps
03-10 16:48:53.735: D/NuCachedSource2(24643): Remaining (704K), HighWaterThreshold (20480K)
03-10 16:48:59.115: D/HTTPBase(24643): [13] Network bandwidth = 103 Kbps
03-10 16:48:59.115: D/NuCachedSource2(24643): Remaining (768K), HighWaterThreshold (20480K)
03-10 16:49:04.480: D/HTTPBase(24643): [14] Network bandwidth = 103 Kbps
03-10 16:49:04.480: D/NuCachedSource2(24643): Remaining (832K), HighWaterThreshold (20480K)
03-10 16:49:09.955: D/HTTPBase(24643): [15] Network bandwidth = 102 Kbps

The code in onCreate() suggests that you have a misconception about how MediaCodec works. Your code is currently:

onCreate() {
    ...setup...
    input();
    output();
}

MediaCodec operates on access units. For video, each call to input/output handles a single frame of video. I haven't worked with audio, but my understanding is that it behaves similarly. You don't get the entire file loaded into the input buffer, and it doesn't play the stream for you; you take one small piece of the file, hand it to the decoder, and it hands back decoded data (e.g. a YUV video buffer or PCM audio data). You then do whatever is necessary to play that data.

So your example would, at best, decode a fraction of a second of audio. You need to be doing the submit-input-get-output in a loop, with proper handling of end of stream (see the sketch below). You can see it done for video in the various bigflake examples. It looks like your code has the necessary pieces.
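For illustration, a minimal sketch of such a loop, reusing the fields from the question's code (codec, extractor, mAudioTrack, info and the buffer arrays). It assumes mAudioTrack.play() has been called and that TIMEOUT_US has been changed to a finite value, for the reason explained in the next paragraph; it is a sketch of the pattern, not a drop-in fix:

    // Sketch: submit-input / drain-output loop until the decoder signals EOS.
    private void decodeLoop() {
        while (!sawOutputEOS) {
            // Feed one access unit of compressed data, if any is left.
            if (!sawInputEOS) {
                int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);
                if (inIndex >= 0) {
                    ByteBuffer dstBuf = codecInputBuffers[inIndex];
                    int sampleSize = extractor.readSampleData(dstBuf, 0);
                    long presentationTimeUs = 0;
                    if (sampleSize < 0) {
                        sawInputEOS = true;   // tell the decoder there is no more input
                        sampleSize = 0;
                    } else {
                        presentationTimeUs = extractor.getSampleTime();
                    }
                    codec.queueInputBuffer(inIndex, 0, sampleSize, presentationTimeUs,
                            sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
                    if (!sawInputEOS) {
                        extractor.advance();
                    }
                }
            }

            // Drain whatever decoded PCM is available and hand it to the AudioTrack.
            int outIndex = codec.dequeueOutputBuffer(info, TIMEOUT_US);
            if (outIndex >= 0) {
                ByteBuffer buf = codecOutputBuffers[outIndex];
                byte[] chunk = new byte[info.size];
                buf.get(chunk);
                buf.clear();
                if (chunk.length > 0) {
                    mAudioTrack.write(chunk, 0, chunk.length);
                }
                codec.releaseOutputBuffer(outIndex, false /* render */);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    sawOutputEOS = true;
                }
            } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                codecOutputBuffers = codec.getOutputBuffers();
            } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                mAudioTrack.setPlaybackRate(
                        codec.getOutputFormat().getInteger(MediaFormat.KEY_SAMPLE_RATE));
            }
        }
    }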

You're using a timeout of -1 (infinite), so you're going to supply one buffer of input and then wait forever for a buffer of output. In video this wouldn't work -- the decoders I've tested seem to want about four buffers of input before they'll produce any output -- but again, I haven't worked with audio, so I'm not sure whether this is expected to work. Since your code is hanging, I'm guessing it's not. It might be useful to change the timeout to (say) 10000 and see if the hang goes away.
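In code terms, a sketch of that change (the -1 value and the constant name come from the question's code):

    // Instead of -1 (wait forever), use a finite timeout in microseconds.
    private static int TIMEOUT_US = 10000; // 10 ms

    // With a finite timeout, dequeueInputBuffer()/dequeueOutputBuffer() may return
    // MediaCodec.INFO_TRY_AGAIN_LATER instead of blocking; that just means nothing
    // is ready yet, so queue more input (or drain output) and try again.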

I'm assuming this is just an experiment and you're not really going to do all of this in onCreate(). :-)
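A minimal sketch of taking it out of onCreate(): run the loop on a worker thread so the main thread isn't blocked by the network and decode calls (decodeLoop() is the hypothetical loop method sketched above):

    // Run the blocking decode loop off the UI thread.
    new Thread(new Runnable() {
        @Override
        public void run() {
            mAudioTrack.play();   // start playback before writing PCM
            decodeLoop();         // hypothetical loop from the sketch above
            mAudioTrack.stop();
            mAudioTrack.release();
            codec.stop();
            codec.release();
            extractor.release();
        }
    }, "AudioDecoder").start();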

