I'm currently using a Teensy to sample an analog signal. I've been having problems transmitting the analog data to my program on the computer: bytes go missing, so when I reconstruct each 10-bit value the byte order is off and I get garbage. To narrow down where things are going wrong, I wrote a simpler test version.
It's based on the PJRC ADC auto-sampler code, and simply sends 16384 bytes every time a request byte arrives from the computer.
The computer program is based on the PJRC bandwidth test: it sends the request byte ('s' in this case), reads 16384 bytes, then checks that every byte is in ascending order (with the wraparound at 0xFF handled). It's Windows-based, and I ran it on an ASUS 1201N 1.6 GHz Win7 netbook.
I've included a sample output file generated from a run of the program, with the terminal output pasted at the top. I'm new to programming microcontrollers and serial ports, but I suspect a buffer is overflowing somewhere. I tried varying the delay between bytes sent from the Teensy with no luck; I always end up losing data at some point. I've also changed several of the port parameters on the computer side without any difference.
I've been having trouble finding any information on this problem, so I'm thinking I'm doing something wrong somewhere in my program. I was wondering if anyone with more experience with serial programming could help me out.