Sample-based decoding is slow - why not transition-based?

General Topics about DSView software
I guess this is inherited from Sigrok, but protocol decoding is extremely slow (I tried UART). I don't know whether it's the decoder script implementation, the API or the application, but it looks like every sample is checked in a loop instead of jumping from transition to transition. So a capture containing just a few UART bytes in one second takes minutes to decode at 128M samples.
IMHO a transition-based decoding approach could increase the speed massively. For example, Ikalogic ScanaStudio (ScanaPlus) decodes the very same signal at 100MHz resolution immediately (and its decoder scripts are JavaScript, so it's probably not a very efficient implementation, just a better approach).
And in the meantime: why not at least display what has already been decoded while decoding is still running?
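To illustrate what I mean, here is a minimal standalone sketch (plain Python with NumPy, not the actual sigrok/DSView decoder API; the function names and the test capture are made up): scanning every sample of a 1-bit trace versus jumping straight to its transitions.

import numpy as np

def edges_per_sample(samples):
    """Scan every sample and record the indices where the level changes."""
    edges = []
    last = samples[0]
    for i, s in enumerate(samples):  # touches all N samples, slow in pure Python
        if s != last:
            edges.append(i)
            last = s
    return edges

def edges_per_transition(samples):
    """Jump straight to the transitions using a vectorized diff."""
    # np.diff is nonzero exactly where two consecutive samples differ,
    # so only the (usually few) edge positions are ever materialized.
    return (np.flatnonzero(np.diff(samples)) + 1).tolist()

if __name__ == "__main__":
    # a 2M-sample slice standing in for a mostly idle UART line
    capture = np.zeros(2_000_000, dtype=np.uint8)
    capture[1_000_000:1_000_104] = 1  # one lone high pulse
    assert edges_per_sample(capture) == edges_per_transition(capture)
    print(edges_per_transition(capture))  # [1000000, 1000104]

The per-sample loop has to visit every one of the 2M samples, while the transition-based version only ever looks at the two edges - that gap is exactly what blows up at 128M samples.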
0xdeadbeef
 

The protocol decoding is implemented in Python scripts.
You are right: under the current implementation, every sample is checked.
In most cases this is not a problem; you just need to set a reasonable sample rate.
For a UART signal, a 1MHz sample rate is enough.
For the same capture time, 1M depth @ 1MHz covers almost the same signal as 128M depth @ 100MHz,
but the decoder will run about 100x faster.
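To put rough numbers on that (a sketch only; 115200 baud is just an assumed UART rate and the 1-second capture is arbitrary):

CAPTURE_SECONDS = 1
BAUD = 115200  # assumed UART baud rate, only for illustration

for rate_hz in (1_000_000, 100_000_000):
    samples = rate_hz * CAPTURE_SECONDS
    samples_per_bit = rate_hz / BAUD
    print(f"{rate_hz // 1_000_000:>4} MHz -> {samples:>12,} samples, "
          f"{samples_per_bit:7.1f} samples per UART bit")

# 1 MHz already gives ~8.7 samples per UART bit, which is enough to decode,
# while the per-sample decoder has 100x fewer samples to walk through.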

We know this is not the best approach, but in most cases you can work around the problem this way.
And we will also take time to improve it.

Thanks.
Andy
Site Admin
 

