Already tried a bunch of options. I receive WaveIn bytes with voice over the network, and they are passed to this function:

 public static void Broadcast(byte[] data)
 {
     waveProvider.AddSamples(data, 0, data.Length);
     // TODO (save to) Environment.CurrentDirectory + @"\out.mp3"
 }

They arrive a few dozen times per second. The question is: is it possible to somehow convert these bytes into MP3 frames (Mp3Frame) [preferably], or at least save this piece of audio to disk? I have been through different options and none of them fit. Right now the audio lives in:

 internal static WaveOut waveOut = new WaveOut();
 internal static BufferedWaveProvider waveProvider =
     new BufferedWaveProvider(new WaveFormat(8000, 2));

It plays through my speakers, because I use

 waveOut.Init(waveProvider);
 waveOut.Play();

I will consider any options. Thanks.
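For reference, the simplest way to get the incoming chunks onto disk is to append them to a single WAV file. A minimal sketch, assuming NAudio's `WaveFileWriter`, that the bytes really are 16-bit PCM in the 8000 Hz stereo format above, and that the `out.wav` path and `Save` name are placeholders:

```csharp
using System;
using NAudio.Wave;

public static class WavDump
{
    // One long-lived writer so all chunks land in a single file.
    // Assumption: incoming bytes are raw 16-bit PCM, 8000 Hz, 2 channels,
    // matching the BufferedWaveProvider format from the question.
    static WaveFileWriter writer;

    public static void Save(byte[] data)
    {
        if (writer == null)
            writer = new WaveFileWriter(
                Environment.CurrentDirectory + @"\out.wav",
                new WaveFormat(8000, 2));

        writer.Write(data, 0, data.Length);
        writer.Flush(); // keeps the RIFF header length up to date
    }
}
```

If the resulting file plays back garbled, the format assumption is wrong, which is exactly what the comments below end up debugging.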

    1 answer

    Here is what I came up with; it somewhat works:

     public static void Broadcast(byte[] data)
     {
         Mp3Frame frame;
         var r = new Mp3FileReader(WavToMP3(data));
         while ((frame = r.ReadNextFrame()) != null)
         {
             foreach (Consumer c in WebCast.Clients)
                 c.Audio(frame.RawData);
             Console.Title = frame.FrameLength.ToString();
         }
     }

     public static MemoryStream WavToMP3(byte[] wavFile)
     {
         using (var retMs = new MemoryStream())
         using (var ms = new MemoryStream(wavFile))
         using (var rdr = new RawSourceWaveStream(ms, new WaveFormat(44100, 16, 1)))
         using (var wtr = new LameMP3FileWriter(retMs, rdr.WaveFormat, 128))
         {
             rdr.CopyTo(wtr);
             wtr.Flush();
             return new MemoryStream(retMs.ToArray());
         }
     }
    • Your BufferedWaveProvider format is (8000, 2), but to RawSourceWaveStream you pass (44100, 16, 1) - is that correct? - MSDN.WhiteKnight
    • at the moment the BufferedWaveProvider has been removed from the code altogether, because it was only there for output via Play(), which I don't use anymore - Kirill Poroh
    • @MSDN.WhiteKnight could there be a conversion problem here? As I understand it, MP3 has a different sample rate, and the number of frames differs too, and I specified that value here. When I set 128 for the WAV stream, it throws an error saying the maximum is 16, which should now be the default. I don't quite understand how this is supposed to work. using (var rdr = new RawSourceWaveStream(ms, new WaveFormat(44100, 2))) using (var wtr = new LameMP3FileWriter(retMs, rdr.WaveFormat, 128)) I still can't figure out at which stage it fails: when broadcasting to the network or when converting - Kirill Poroh
    • when I try to write the received MP3 bytes back into the wave buffer to listen to them, waveOut outputs white noise - Kirill Poroh
    • So what format do you actually have in the byte[] data array: 44100 Hz mono or 8000 Hz stereo? Try writing it to a WAV file first instead of MP3 and make sure the result plays back normally in a player. - MSDN.WhiteKnight
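    To sidestep the format mismatch the comments circle around, the capture format can be passed straight to the encoder instead of re-wrapping each chunk at (44100, 16, 1). A sketch under the assumption that the bytes are 8000 Hz stereo 16-bit PCM (the question's BufferedWaveProvider format); `Encode` and the output path are placeholders, and the bitrate is kept low because 128 kbps is not a valid MP3 bitrate at 8 kHz, which would explain the error mentioned above:

```csharp
using System;
using NAudio.Wave;
using NAudio.Lame;

public static class Mp3Dump
{
    // A single long-lived encoder: LAME keeps internal state between
    // chunks, so the output stays one continuous, valid MP3 stream,
    // instead of a new tiny MP3 file per network packet.
    static LameMP3FileWriter writer;

    public static void Encode(byte[] data)
    {
        if (writer == null)
            writer = new LameMP3FileWriter(
                Environment.CurrentDirectory + @"\out.mp3",
                new WaveFormat(8000, 2), // must match the real capture format
                32);                     // kbps; 128 is out of range at 8 kHz

        writer.Write(data, 0, data.Length);
    }
}
```

    The same long-lived-writer idea applies if the frames are then parsed with Mp3FileReader for broadcasting: feeding it chunks encoded with a shared encoder state avoids the per-chunk header overhead of re-running WavToMP3 on every packet.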