Hi,
// Called once per buffer period from a worker thread
ALuint bf;
ALuint src;
int size;
short* data;
int sampling_rate;

// minimum/maximum are byte offsets into the main data block
size = maximum - minimum;
data = new short[maximum - minimum];
data = (short*)memcpy(data, &GetData()[minimum], size);
sampling_rate = GetSamplingRate();

// Hand the chunk to OpenAL and start playback
alGenBuffers(1, &bf);
alBufferData(bf, AL_FORMAT_STEREO16, data, size * 2, sampling_rate);
alGenSources(1, &src);
alSourcei(src, AL_BUFFER, bf);
alSourcePlay(src);
This code is called periodically (once per buffer length) until the entire track has been played back; a thread drives the buffer cycle. The problem is this: when short* GetData(), which returns the actual sample data, is played back in one piece, everything is fine, but as soon as I start chopping it up, the sound becomes all messed up. I've been over and over it, and this is the skeleton code I've pared it down to. Playback is incontiguous: the waveform is severed in places, parts of it are skipped, and parts of it play at the wrong time (e.g. the beginning plays again in the middle). I'm sure the fault here is mine, but I cannot see it.
minimum and maximum define the byte offsets at which the buffer is to begin and end within the main data block. Is &GetData()[minimum] the correct way to copy data starting at that index?
I've tried various buffer sizes from 25 to 3000 ms; the bigger they are, the better the playback, but I'd like to get it working in the 10 to 500 ms range.
Does anyone have any clues as to what I might be doing wrong? If necessary I'll post more code; the only problem is that there's rather a lot of it.
Thanks in advance,
Crispy
[edited by - crispy on June 3, 2002 6:27:12 PM]
"Literally, it means that Bob is everything you can think of, but not dead; i.e., Bob is a purple-spotted, yellow-striped bumblebee/dragon/pterodactyl hybrid with a voracious addiction to Twix candy bars, but not dead."- kSquared