How to play custom stream #348
-
We would like to play an H.264 or H.265 elementary stream (the same kind of stream that is carried over RTSP), but our transport protocol is different (proprietary). When we use a custom stream, we either get an error or the player does nothing, even though the video is exactly the same video that is carried inside RTSP. What should we do to be able to play it? Let me clarify with examples:

```csharp
protected Stream? customInput;

// Sample using a 'custom' IO stream
customInput = new FileStream(filePath, FileMode.Open);
player!.OpenAsync(customInput);
```

Or we open an RTSP link with FlyLeaf like this:

```csharp
player!.OpenAsync(url);
```

But let's say the file I'm trying to open in the example below arrives piece by piece from somewhere else:

```csharp
customInput = new MemoryStream();
player!.OpenAsync(customInput);

while ((bytesRead = await [incoming data piece]))
{
    // Write the received piece to customInput
    await customInput.WriteAsync(buffer, 0, bytesRead);
}
```

Thanks in advance.
-
Hi @fatihbahceci, you will need to create a custom `System.IO.Stream` and override at least the `Read` method. Here you can find a sample class from another user. That way you will be able to see when the FFmpeg demuxer requests data and check whether you have already received it, or whether you need to block until it arrives.
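A minimal sketch of such a class, assuming a producer thread pushes incoming chunks via an `Append` method (the class and member names here are illustrative, not part of FlyLeaf's API):

```csharp
using System.Collections.Concurrent;

// Illustrative example: a read-only Stream whose Read() blocks
// until a producer has appended data. Names are hypothetical.
public class BlockingByteStream : Stream
{
    private readonly BlockingCollection<byte[]> chunks = new();
    private byte[]? current;
    private int currentOffset;

    // Called by the producer (e.g. your proprietary protocol client).
    public void Append(byte[] data) => chunks.Add(data);
    public void CompleteWriting() => chunks.CompleteAdding();

    public override int Read(byte[] buffer, int offset, int count)
    {
        // Pull the next chunk, blocking until one is available.
        if (current == null || currentOffset >= current.Length)
        {
            if (!chunks.TryTake(out current, Timeout.Infinite))
                return 0; // producer finished: signal end of stream
            currentOffset = 0;
        }
        int n = Math.Min(count, current.Length - currentOffset);
        Array.Copy(current, currentOffset, buffer, offset, n);
        currentOffset += n;
        return n;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => 0;
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long o, SeekOrigin so) => throw new NotSupportedException();
    public override void SetLength(long v) => throw new NotSupportedException();
    public override void Write(byte[] b, int o, int c) => throw new NotSupportedException();
}
```

Note that `Read` may legitimately return fewer bytes than requested; the demuxer will call it again. Returning 0 is interpreted as end of stream, so only do that once the producer is done.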
-
Hello, first of all, thank you very much for your response. Based on your answer, I prepared the class below. It stores the incoming data and forwards it to the player.

```csharp
using System.Collections.Concurrent;
using System.Diagnostics;

namespace OnTheFlyTest.Core.Winform.Common
{
    internal class FBCStream : Stream
    {
        private ConcurrentQueue<byte[]> data = new ConcurrentQueue<byte[]>();
        private byte[]? remainingData = null;
        private int remainingDataOffset = 0;

        public event EventHandler<string>? OnTrace;

        protected void trace(string s, params object[] prms)
        {
            if (OnTrace != null)
            {
                try { s = string.Format(s, prms); } catch { }
                OnTrace?.Invoke(this, s);
            }
        }

        public FBCStream()
        {
        }

        // The demuxer checks CanRead; it must return true
        // (throwing NotImplementedException here breaks playback).
        public override bool CanRead => true;
        public override bool CanSeek => false;
        public override bool CanWrite => true;
        public override long Length => 0;
        public override long Position { set { } get { return 0; } }

        public override void Flush()
        {
            trace("Flush");
        }

        public override int Read(byte[] buffer, int offset, int count)
        {
            trace($"Read offset: {offset} count: {count}");
            int read = 0;
            Stopwatch sw = new Stopwatch();
            sw.Start();
            // Wait up to 3 seconds for data to arrive.
            while (read <= 0 && sw.ElapsedMilliseconds < 3000)
            {
                // First drain any partially consumed chunk from a previous call.
                if (remainingData != null)
                {
                    int remainingDataLen = remainingData.Length - remainingDataOffset;
                    if (remainingDataLen > 0)
                    {
                        int copyLen = Math.Min(count, remainingDataLen);
                        Array.Copy(remainingData, remainingDataOffset, buffer, offset, copyLen);
                        remainingDataOffset += copyLen;
                        read += copyLen;
                    }
                    if (remainingDataOffset >= remainingData.Length)
                    {
                        remainingData = null;
                        remainingDataOffset = 0;
                    }
                }
                // Then copy queued chunks until the buffer is full or the queue is empty.
                while (read < count && !this.data.IsEmpty)
                {
                    if (this.data.TryDequeue(out byte[]? chunk))
                    {
                        int copyLen = Math.Min(count - read, chunk.Length);
                        Array.Copy(chunk, 0, buffer, offset + read, copyLen);
                        read += copyLen;
                        if (copyLen < chunk.Length)
                        {
                            // Keep the unconsumed tail for the next Read call.
                            remainingData = chunk;
                            remainingDataOffset = copyLen;
                        }
                    }
                }
                if (read <= 0)
                {
                    Thread.Sleep(1); // avoid busy-spinning while waiting for data
                }
            }
            trace($"Read {read} bytes");
            return read;
        }

        public override long Seek(long offset, SeekOrigin origin)
        {
            trace("Seeking " + offset.ToString() + "," + origin.ToString());
            return 0;
        }

        public override void SetLength(long value)
        {
            trace("SetLength");
        }

        public override void Write(byte[] buffer, int offset, int count)
        {
            trace($"Write offset: {offset} count: {count}");
            if (count > 0)
            {
                byte[] chunk = new byte[count];
                Array.Copy(buffer, offset, chunk, 0, count);
                this.data.Enqueue(chunk);
            }
        }

        protected override void Dispose(bool disposing)
        {
            trace("Dispose");
            data?.Clear();
            remainingData = null;
            base.Dispose(disposing);
        }
    }
}
```

To test the class, I first tried reading a file, and it worked quite well. The while block below simulates data from a file arriving piece by piece from somewhere else.

```csharp
public override async Task StartAsync()
{
    customInput = new FBCStream();
    using FileStream fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true);
    byte[] buffer = new byte[4096];
    bool firstBytesReceived = false;
    int bytesRead;
    // Simulate incoming data from a file as if it were arriving from somewhere else.
    while ((bytesRead = await fileStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
    {
        if (!firstBytesReceived)
        {
            trace("FirstBytesReceived");
            firstBytesReceived = true;
            player!.OpenAsync(customInput);
        }
        // Write the read data to customInput
        await customInput.WriteAsync(buffer, 0, bytesRead);
    }
    CallOnFinishedEvent();
}
```

Then I tried to receive data from an RTSP server using the RtspClientSharp library. Should I make any special settings related to RTSP? Can you also help me with this? Thanks in advance.

```csharp
public override async Task StartAsync()
{
    isFirstReceive = false;
    trace("startRTSP");
    customInput = new FBCStream();
    (customInput as FBCStream)!.OnTrace += (s, e) => trace(e);
    rtspClient = new NDRTSPClient(rtspLink);
    rtspClient.OnFrameReceive += rtspClient_OnFrameReceive;
    rtspClient.OnDisconnect += rtspClient_OnDisconnect;
    rtspClient.OnConnect += rtspClient_OnConnect;
    rtspClient.OnTrace += rtspClient_OnTrace;
    await rtspClient.Open();
}

...
...

private void rtspClient_OnFrameReceive(object? s, RawFrame e)
{
    //trace("NDRTSPClient.OnFrameReceive: {0}", e.Type);
    if (e != null)
    {
        // For I-frames, write the SPS/PPS parameter sets before the frame data.
        if (e.Type == FrameType.Video && e is RawH264IFrame rawH264IFrame)
        {
            customInput!.Write(rawH264IFrame.SpsPpsSegment.Array!, rawH264IFrame.SpsPpsSegment.Offset, rawH264IFrame.SpsPpsSegment.Count);
        }
        customInput!.Write(e.FrameSegment.Array!, e.FrameSegment.Offset, e.FrameSegment.Count);
        if (!isFirstReceive)
        {
            isFirstReceive = true;
            trace("First frame received");
            player!.OpenAsync(customInput);
        }
    }
    else
    {
        trace($"Null Frame Received - Client Id: {(s as NDRTSPClient)!.ClientId}");
    }
}
```
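One detail worth checking when feeding raw H.264 from an RTSP client into a demuxer: a raw elementary stream is expected in Annex B format, meaning every NAL unit (including SPS/PPS) is prefixed with a `00 00 00 01` start code. Whether RtspClientSharp's `FrameSegment` and `SpsPpsSegment` already include start codes is an assumption you should verify; if they do not, a sketch of prepending them (the helper name is hypothetical):

```csharp
// Hypothetical helper: prepend an Annex B start code to a NAL unit
// before writing it into the custom stream. Only needed if the RTSP
// client delivers bare NAL units without start codes.
static readonly byte[] StartCode = { 0x00, 0x00, 0x00, 0x01 };

static void WriteNalUnit(Stream output, byte[] nal, int offset, int count)
{
    output.Write(StartCode, 0, StartCode.Length);
    output.Write(nal, offset, count);
}
```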
-
@fatihbahceci As I am not familiar with elementary streams (from a quick look it just means codec data without a format context?), it would help if you could create a sample project so I can check it (you can drop me an email if you don't want to share it in public). Tbh, I don't like the idea of using another RTSP client to achieve this; FFmpeg can probably do it for you. How would you do it with ffplay (or ffmpeg)? Maybe if you could set the extension to .264 or .bin, or force the format, it would be easier? (PS: logs would help too; if you only get a single image, try disabling VideoAcceleration. I just fixed an issue that might be the reason you could not see anything.)
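For reference, FFmpeg's raw demuxers can play a bare elementary stream when the input format is forced explicitly, which is a quick way to confirm the dumped bytes are playable at all (`dump.264` is a hypothetical file written from the custom stream):

```
# Force the raw H.264 (Annex B) demuxer; use -f hevc for an H.265 stream
ffplay -f h264 dump.264
```

If ffplay can play the dump with a forced format but FlyLeaf cannot play the live stream, the problem is more likely in how the stream is fed than in the bytes themselves.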
-
Hello! The problem has been resolved with the new update you made. Thank you so much for your prompt response and commit. I would also like to mention that the FBCStream class I wrote for this task, as I pointed out in my previous reply, can serve as an example for others who want to do something similar, and they can extend it to be more functional. After the new update, I saw that the problem was solved without making any changes. Thank you again.