
Android: Transmitting audio to a Doorbird device


I'm trying to create an Android application that connects to a Doorbird device. I'm aware of the company's official app, but I need more functionality tailored to my needs.

For anyone who doesn't know what a Doorbird device is: Doorbird is a smart intercom made by the Doorbird company. The device can stream audio and video over HTTP and RTSP to any consumer (such as an Android app), which can fetch the audio stream and play it, and it can also accept audio recorded on an Android device and transmitted to it. The audio format is G.711 µ-law.

I'm able to receive the video and audio streams from the Doorbird, and that works well, but I have not managed to transmit audio in µ-law format to the Doorbird. The error I get is:
HTTP FAILED: java.net.ProtocolException: Unexpected status line:

I tried transmitting the same bytes I received from the Doorbird back to it, but I still get the same error.

Of course I'm working from the API they published, but there isn't much information there about the protocol for transmitting audio.

Is there an example of an Android project that integrates with Doorbird?

Can anyone help with streaming audio to the Doorbird?

Which protocol should be used?

I would even appreciate hearing from anyone who knows how to transmit audio to a Doorbird with any other tool or system, not just Android.

This is what I tried: I receive data from the Doorbird (which, as I said, works), wait 3 seconds, and then transmit that data back to the Doorbird using the Retrofit library.

const val AUDIO_PATH =
    "http://192.168.1.187/bha-api/audio-receive.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX"

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)

    //InputStream inputStream = getResources().openRawResource(R.raw.piano12);
    val thread = Thread { this.playUrl() }
    thread.start()
    //val inStr = assets.open("doorbird_record")
}

private fun playUrl() {
    val inStr = URL(AUDIO_PATH).openStream()
    val buffer = ByteArray(1000)
    var i = 0

    //while (inStr.read(buffer).also { i = it } != -1) {


    Handler(Looper.getMainLooper()).postDelayed({
        //inStr.close()
        inStr.read(buffer)
        Log.d("DoorbirdLog", inStr.toString())
        val part = MultipartBody.Part.createFormData(
            "doorbirdStream", "doorbird", buffer.toRequestBody(
                ("audio/basic").toMediaType()
            )
        )
        //val rb = file.asRequestBody(("audio/*").toMediaType())
        val call = NetworkManager.instanceServiceApi.upload(part)
        call.enqueue(object : Callback<ResponseBody> {
            override fun onResponse(
                call: Call<ResponseBody>,
                response: Response<ResponseBody>
            ) {
                val i = response.body()
                Log.d("success", i.toString())
            }

            override fun onFailure(call: Call<ResponseBody>, t: Throwable) {
                Log.d("failed", t.message.toString())
            }
        })

    }, 3000)

}

And the Retrofit instance:

@Multipart
@Headers( "Content-Type: audio/basic",
        "Content-Length: 9999999",
        "Connection: Keep-Alive",
        "Cache-Control: no-cache")
@POST("audio-transmit.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX")
fun upload(@Part part: MultipartBody.Part): Call<ResponseBody>
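
The NetworkManager.instanceServiceApi used in playUrl() is not shown in the post. Below is a minimal sketch of what that Retrofit wiring might look like; the names are hypothetical (it assumes the interface above is called DoorbirdApi) and the base URL is taken from AUDIO_PATH in the question.

    import okhttp3.OkHttpClient
    import retrofit2.Retrofit
    import java.util.concurrent.TimeUnit

    // Hypothetical wiring for NetworkManager.instanceServiceApi (not part of the original post)
    object NetworkManager {
        private val client = OkHttpClient.Builder()
            .connectTimeout(10, TimeUnit.SECONDS)
            .readTimeout(30, TimeUnit.SECONDS)
            .build()

        private val retrofit = Retrofit.Builder()
            .baseUrl("http://192.168.1.187/bha-api/") // Doorbird base URL from the question
            .client(client)
            .build()

        // DoorbirdApi is assumed to be the interface that declares upload() above
        val instanceServiceApi: DoorbirdApi = retrofit.create(DoorbirdApi::class.java)
    }

No converter factory is needed here, because upload() returns a plain Call<ResponseBody> and takes a MultipartBody.Part, both of which Retrofit handles with its built-in converters.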

Any help would be greatly appreciated.

Eventually I found a solution, and I'll present it here briefly for anyone who runs into trouble integrating with Doorbird.

private const val FREQUENCY_SAMPLE_RATE_TRANSMIT = 8000
private const val RECORD_STATE_STOPPED = 0

override suspend fun recordAndTransmitAudio(audioTransmitUrl: String) =
        withContext(Dispatchers.IO) {
            val minBufferSize = AudioRecord.getMinBufferSize(
                FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT
            )
            mRecorder = AudioRecord(
                MediaRecorder.AudioSource.VOICE_COMMUNICATION,
                FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBufferSize
            )
            mRecorder?.let { enableAcousticEchoCanceler(it.audioSessionId) }
            mRecorder?.startRecording()

            val bufferShort = ShortArray(minBufferSize)
            val buffer = ByteArray(minBufferSize)

            val urlConnection = URL(audioTransmitUrl).openConnection() as HttpURLConnection
            urlConnection.apply {
                doOutput = true
                setChunkedStreamingMode(minBufferSize)
            }

            val output = DataOutputStream(urlConnection.outputStream)
            output.flush()

            try {
                mRecorder?.let { recorder ->
                    while (recorder.read(bufferShort, 0, bufferShort.size) != RECORD_STATE_STOPPED) {
                        G711UCodecManager.encode(bufferShort, minBufferSize, buffer, 0)
                        output.write(buffer)
                    }
                }
            }catch (e: Exception){
                Log.d(TAG, e.message.toString())
            }
            output.close()
            urlConnection.disconnect()
        }
  • First, we prepare the parameters needed for recording and transmitting
  • We get the minimum buffer size for the recording
  • We define the recording object (AudioRecord)
  • We enable the acoustic echo canceler
  • Then we start recording
  • We open a connection to the transmit URL
  • We run a while loop for as long as recording has not stopped
  • We encode the recorded data from 16-bit PCM to G.711 µ-law (sketches of the helpers used here follow this list)
  • And of course, once we have finished recording, we clean up the resources
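
The helpers referenced in the solution, G711UCodecManager.encode() and enableAcousticEchoCanceler(), are not included in the post. The sketch below shows one way they might look, assuming a standard G.711 µ-law encoder and the platform AcousticEchoCanceler effect; the names and signatures are only inferred from the calls above.

    import android.media.audiofx.AcousticEchoCanceler

    // Sketch of a G.711 µ-law encoder matching the call
    // G711UCodecManager.encode(bufferShort, minBufferSize, buffer, 0) used above.
    object G711UCodecManager {

        private const val BIAS = 0x84   // standard µ-law bias (132)
        private const val CLIP = 32635  // clip level applied before biasing

        // Encode one 16-bit linear PCM sample into one 8-bit µ-law byte.
        private fun linearToUlaw(sample: Short): Byte {
            var pcm = sample.toInt()
            val sign = if (pcm < 0) 0x80 else 0x00
            if (pcm < 0) pcm = -pcm
            if (pcm > CLIP) pcm = CLIP
            pcm += BIAS

            // Locate the segment (exponent) of the biased magnitude.
            var exponent = 7
            var mask = 0x4000
            while (exponent > 0 && (pcm and mask) == 0) {
                exponent--
                mask = mask shr 1
            }
            val mantissa = (pcm shr (exponent + 3)) and 0x0F

            // µ-law bytes are transmitted inverted.
            return (sign or (exponent shl 4) or mantissa).inv().toByte()
        }

        // Encode `count` samples from src into dst, starting at dstOffset.
        fun encode(src: ShortArray, count: Int, dst: ByteArray, dstOffset: Int) {
            for (i in 0 until count) {
                dst[dstOffset + i] = linearToUlaw(src[i])
            }
        }
    }

    // Sketch of the echo-canceler helper called before startRecording().
    private var echoCanceler: AcousticEchoCanceler? = null

    private fun enableAcousticEchoCanceler(audioSessionId: Int) {
        if (AcousticEchoCanceler.isAvailable()) {
            echoCanceler = AcousticEchoCanceler.create(audioSessionId)
            echoCanceler?.setEnabled(true)
        }
    }

Since µ-law produces one byte per 16-bit sample, a ByteArray of minBufferSize bytes is enough to hold the encoded form of minBufferSize samples, which is why the two buffers in the solution share the same length. A hypothetical call to the solution would use the transmit endpoint from the Retrofit interface above, for example recordAndTransmitAudio("http://192.168.1.187/bha-api/audio-transmit.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX").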