Issue 2073: Ideas for more automated audio NACK tests
Starred by 9 users. Project Member, reported by firstname.lastname@example.org, Jul 11 2013.
This is a follow-up to bug 1601. See also bug 2072 for ideas for future manual testing.

Turaj added unit tests as part of his audio NACK implementation. We brainstormed other ideas for automated tests; I'm not sure whether these tests could live somewhere in the VoE:

a) Drop some audio packets, call the NACK API, and verify that the dropped packets appear in the list the API returns.

b) Drop an audio packet, NACK it with a deadline of "needed in X ms", and have the sender resend/inject the packet in time. Then query NetEQ for packet loss metrics; NetEQ should report zero packet loss.

c) Drop an audio packet, NACK it with a deadline of "needed in X ms", and have the sender resend/inject the packet too late. Then query NetEQ for packet loss metrics; NetEQ should report non-zero packet loss and, if it supports it, a non-zero dropped-packet count.

Andrew, let's talk about these ideas when you're back.
Jul 15 2013
I think it would be instructive to know what kind of testing (if any) is done for video NACK. Added Patrik, Mikhal and Stefan for details.
Oct 15 2013
For video we have several unit tests verifying that NACK lists contain the right packets and that only complete frames are decoded. We also try to cover corner cases, such as sequence number wraps, and we have fairly simple integration tests (https://code.google.com/p/webrtc/source/browse/trunk/webrtc/video_engine/test/call_tests.cc#284). I would recommend that you unit-test ACM with NACK enabled so that you can also verify NetEQ behavior. I would also add a voice engine integration test.