I inherited a DI-1100 and started experimenting with it using the available documentation (the manual and the command protocol), but found several inconsistencies between the docs and the device's actual behavior. The device is running the latest firmware. For example:
Moving between libusb and CDC mode
- I used the instructions here (holding the button down within 5 seconds of start), but it didn't work.
- It turns out I had to follow the instructions for the DI-2008 instead (repeatedly pressing the button about twice a second). That worked.
- According to the documentation, the value of dec is fixed at 1. However, I was able to set it as high as 512, and the change was reflected in the rate at which I received data from the device.
- Proof below
Binary stream output format
- According to the doc, bits B2 and B3 of the first byte for *any* channel should always be 0. That is not what I see.
- Sent "info 1"; device echoes: info 1 1100
- Sent "info 2"; device echoes: info 2 89
- Sent "dec 512"; accepted.
- Sent "dec" (no value); device echoes: dec 512
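For reference, here is how I'd script that exchange. This is just a sketch of the framing, assuming the CR-terminated ASCII command protocol described in the doc; actually sending it over the CDC serial port (e.g. with pyserial) is left out.

```python
def frame(cmd: str) -> bytes:
    """Build the bytes for one protocol command (ASCII, CR-terminated)."""
    return (cmd + "\r").encode("ascii")

def parse_echo(raw: bytes) -> tuple[str, str]:
    """Split an echoed response like b'dec 512\\r' into (command, value)."""
    text = raw.decode("ascii").strip()
    key, _, value = text.partition(" ")
    return key, value

print(frame("dec 512"))          # b'dec 512\r'
print(parse_echo(b"dec 512\r"))  # ('dec', '512')
```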
Nothing plugged in
- First byte: 11111000 b'\xf8'
- Second byte: 11111111 b'\xff'
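Decoding that idle sample as one little-endian 16-bit two's-complement word gives a count of -8; scaled against a full-scale count of 32768, that is the -0.0024 I also read in ASCII mode. Both the byte order and the ±10 V full scale are my assumptions here, not something the doc confirms for this case.

```python
# Decode the idle sample above as a little-endian signed 16-bit word.
# Byte order and +/-10 V full scale are assumptions on my part.
raw = b"\xf8\xff"                               # 0b11111000, 0b11111111
word = int.from_bytes(raw, "little", signed=True)
volts = word * 10.0 / 32768                     # assumed full-scale count
print(word, volts)                              # -8 -0.00244140625
```

Note that -8 is not a multiple of 16, which is what I'd expect if only the top 12 bits carried ADC data.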
Using filter 3 (minimum value) in ASCII mode (to sidestep the potential binary discrepancies above), I read -0.0024 when nothing is plugged in. That value is fine by itself, except that with 12-bit resolution in two's complement, the finest step I should see is about -0.0048, twice what I'm reading. Is the actual resolution 13 bits? Or even 14, given that I see B2 and B3 change (as above)?
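To put that granularity argument in numbers, here is one LSB of the input span at different bit depths; the ±10 V (20 V span) range is my assumption:

```python
# One LSB at various bit depths, assuming a -10 V .. +10 V input range.
span = 20.0
for bits in (12, 13, 14):
    print(bits, span / 2**bits)
# 12 -> 0.0048828125
# 13 -> 0.00244140625
# 14 -> 0.001220703125
```

The -0.0024 reading lines up exactly with a 13-bit step, not a 12-bit one.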
I'm not sure what's going on. It doesn't look like I actually have a DI-2008, as the "libusb<->CDC" switch behavior would suggest: info 1 reports 1100, and D0 and D1 definitely read zero (nothing is connected to those ports), yet parts of its behavior match the DI-2008 documentation rather than the DI-1100's.
At this point, given all of the above, it's hard for me to know exactly what the specs for this device really are. Any help and insights would be appreciated!