I got an oscilloscope and reverse engineered the protocol.
The w26 codebit sends serial binary data to the o28 LED matrix to show images. Each image is sent as a 12.1ms burst of data.
When no data is being sent, the voltage of the data line is kept at 5V.
Data is sent as a series of eight-bit bytes. Each byte takes about 175.5μs. Each byte begins with 17.3μs at 0V, followed by eight data bits, least significant bit first, each 17.3μs long (0V for a 0, 5V for a 1), and finally a terminating 19.5μs at 5V.
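To make the framing concrete, here's a minimal sketch in C that prints one byte as the sequence of (level, duration) steps it would produce on the data line. The timing constants are the measurements above; `emit_byte` is just an illustrative helper name, not anything from littleBits.

```c
#include <stdio.h>
#include <stdint.h>

/* Measured timings, in microseconds. */
#define BIT_US  17.3  /* start period and each data bit */
#define STOP_US 19.5  /* terminating high period        */

/* Print one byte as the (level, duration) sequence seen on the wire:
 * a 0V start period, eight data bits LSB first (0V = 0, 5V = 1),
 * then a 5V stop period. */
static void emit_byte(uint8_t b)
{
    printf("0V %.1fus (start)\n", BIT_US);
    for (int i = 0; i < 8; i++) {
        int bit = (b >> i) & 1;  /* least significant bit first */
        printf("%dV %.1fus (bit %d)\n", bit ? 5 : 0, BIT_US, i);
    }
    printf("5V %.1fus (stop)\n", STOP_US);
}

int main(void)
{
    emit_byte(0x26);  /* the postamble byte, as an example */
    return 0;
}
```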
The preamble is four bytes, 0x1C001140 (in network byte order). The postamble is one byte, 0x26. One or both of these probably change based on the configuration; I was testing a 1x1 matrix group (i.e. just one LED matrix).
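As a sanity check: if the matrix is 8×8 (64 pixels), a frame would be 4 + 64 + 1 = 69 bytes, and 69 × 175.5μs ≈ 12.1ms, which lines up with the burst length above. Here's a sketch in C that assembles such a frame; the 8×8 size and row-major pixel ordering are assumptions on my part, and `build_frame` is my own illustrative helper.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define WIDTH  8
#define HEIGHT 8  /* assumed 8x8: 4+64+1 bytes * ~175.5us comes to ~12.1ms */

/* Assemble one frame into out[] and return its length in bytes:
 * 4-byte preamble, one byte per pixel (assumed row-major), 1-byte postamble. */
static size_t build_frame(const uint8_t pixels[WIDTH * HEIGHT], uint8_t *out)
{
    static const uint8_t preamble[] = { 0x1C, 0x00, 0x11, 0x40 }; /* network byte order */
    size_t n = 0;

    memcpy(out, preamble, sizeof preamble);
    n += sizeof preamble;
    memcpy(out + n, pixels, WIDTH * HEIGHT);
    n += WIDTH * HEIGHT;
    out[n++] = 0x26;  /* postamble observed for a 1x1 matrix group */
    return n;
}

int main(void)
{
    uint8_t pixels[WIDTH * HEIGHT] = { 0 };  /* all-black image */
    uint8_t frame[4 + WIDTH * HEIGHT + 1];
    size_t len = build_frame(pixels, frame);
    printf("frame is %zu bytes (~%.1f ms on the wire)\n", len, len * 175.5 / 1000.0);
    return 0;
}
```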
The image data is 8-bit RGB. On the wire, the first two bits of each byte represent the blue signal, the next three bits represent the green signal, and the final three bits represent the red signal. Since bits are sent least significant bit first, this is equivalent to the low two bits being blue, the next three bits being green, and the high three bits being red.
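Packing an ordinary 8-bit-per-channel color into this format just means keeping the top three bits of red and green and the top two bits of blue; `pack_pixel` below is an illustrative name of mine, not part of any littleBits API.

```c
#include <stdio.h>
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into the wire byte:
 * high three bits red, next three green, low two blue. */
static uint8_t pack_pixel(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint8_t)((r & 0xE0) | ((g & 0xE0) >> 3) | (b >> 6));
}

int main(void)
{
    printf("white = 0x%02X\n", pack_pixel(255, 255, 255)); /* 0xFF */
    printf("red   = 0x%02X\n", pack_pixel(255, 0, 0));     /* 0xE0 */
    printf("green = 0x%02X\n", pack_pixel(0, 255, 0));     /* 0x1C */
    printf("blue  = 0x%02X\n", pack_pixel(0, 0, 255));     /* 0x03 */
    return 0;
}
```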
For reasons I cannot explain, pulses from the w26 codebit are sometimes longer than the timings above would predict. The o28 LED matrix appears to either ignore the bogus data or somehow correct for it; either way, glitches in the codebit's output don't show up on the LED matrix.
I haven't tried reading or writing serial data via the cloudbit yet, but adding that feature to my localbit project is my next task. I'd be very curious as to whether anyone manages to get the arduino bit to control the LED matrix.