sirket wrote:16 bits for 9 is hardly a big deal on the USB bus.
As for the logic side- If they are, in fact, FPGA's with no buffers then I am definitely confused.
It's basically a parallel to USB connection- no need for processor cores or anything else. Still- I haven't bothered to design one so I probably shouldn't criticize.
We are drifting way off topic now, but basically USB is a complex protocol which needs some processing to service. At its simplest, a parallel-to-USB converter needs to be able to interpret packets on the USB bus and respond with the correct set-up data (device and configuration descriptors), then process bulk transfers via an endpoint. There is also some protocol management, like checking CRCs and doing re-transmits if required. The Logic doesn't have much of a buffer itself either, so there needs to be additional code to manage the buffer and signal overruns if the USB interface can't keep up.
Maybe you could do all that as pure logic on an FPGA but I am unaware of any designs which do. They all have some kind of code execution because it's just so much easier to do that way.
Going back to the point about 9 bits padded to a 16-bit word: sure, it's not hard to do, but you are missing the point. Since the choice is either wasting the other 7 bits or going to all the extra effort of packing/unpacking 9-bit words, you might as well just use all 16 bits. That is why most logic analysers have multiples of 8 inputs. Sorry if I'm repeating myself but I really don't think it's that difficult to understand.
I agree completely wrt visualizing waveforms. As for the digital versus analog: digital is simple enough; almost every time I run into problems it's because of analog characteristics present in the digital waveform (ringing, bouncing, etc.).
No offence but your inability to grasp why logic analysers tend to have multiples of 8 inputs suggests otherwise.
To give an example, I have found my Logic very handy when working on timing issues or emulating existing but poorly documented protocols. For instance, I recently improved compatibility with some models of the Playstation 2. There is of course no official documentation for the controller protocol in the public domain, but based on what people have discovered, plus my own readings taken from a real PS2 controller, I had a working system. For some reason a few models of PS2 and one type of PS2-to-USB converter still didn't work, though.

It turns out those devices were starting the next byte too soon and not leaving enough time for the controller to ACK the last one. Sony PS2 controllers cope because they can ACK and read/write bytes at the same time, but my hardware and several third-party controllers could not. It seems Sony realised this too: after two revisions that had the problem (one of them on only one of the two ports, for some reason) they fixed it. Fortunately I was able to modify my code.